Last month, a court in Kenya issued a landmark ruling against Meta, owner of Facebook and Instagram. The US tech giant was, the court ruled, the “true employer” of the hundreds of people employed in Nairobi as moderators on its platforms, trawling through posts and images to filter out violence, hate speech, and other shocking content. That means Meta can be sued in Kenya for labor rights violations, even though moderators are technically employed by a third-party contractor.
Social media giant TikTok was watching the case closely. The company also uses outsourced moderators in Kenya, and in other countries in the global south, through a contract with Luxembourg-based Majorel. Leaked documents obtained by the NGO Foxglove Legal, seen by WIRED, show that TikTok is concerned it could be next in line for possible litigation.
“TikTok will likely face reputational and regulatory risks for its contractual arrangement with Majorel in Kenya,” the memo says. If the Kenyan courts rule in the moderators’ favor, the memo warns, “TikTok and its competitors could face scrutiny for real or perceived labor rights violations.”
The ruling against Meta came after the tech company tried to get the court to dismiss a case brought against it and its outsourcing partner, Sama, by South African moderator Daniel Motaung, who was fired after trying to form a union in 2019.
Motaung said the work, which meant watching hours of violent, graphic, or otherwise traumatizing content daily, left him with post-traumatic stress disorder. He also alleged that he hadn’t been fully informed about the nature of the work before he’d relocated from South Africa to Kenya to start the job. Motaung accuses Meta and Sama of several abuses of Kenyan labor law, including human trafficking and union busting. Should Motaung’s case succeed, it could allow other large tech companies that outsource to Kenya to be held accountable for the way staff there are treated, and provide a framework for similar cases in other countries.
“[TikTok] reads it as a reputational threat,” says Cori Crider, director of Foxglove Legal. “The fact that they are exploiting people is the reputational threat.”
TikTok did not respond to a request for comment.
In January, as Motaung’s lawsuit progressed, Meta attempted to cut ties with Sama and move its outsourcing operations to Majorel—TikTok’s partner.
In the process, 260 Sama moderators were expected to lose their jobs. In March, a judge issued an injunction preventing Meta from terminating its contract with Sama and moving the work to Majorel until the court was able to determine whether the layoffs violated Kenyan labor laws. In a separate lawsuit, Sama moderators, some of whom spoke to WIRED earlier this year, alleged that Majorel had blacklisted them from applying for the new Meta moderator jobs, in retaliation for trying to push for better working conditions at Sama. In May, 150 outsourced moderators working for TikTok, ChatGPT, and Meta via third-party companies voted to form and register the African Content Moderators Union.
Majorel did not respond to a request for comment.
The TikTok documents show that the company is considering an independent audit of Majorel’s site in Kenya. Majorel has sites around the world, including in Morocco, where its moderators work for both Meta and TikTok. Such an exercise, which often involves hiring an outside law firm or consultancy to conduct interviews and deliver a formal assessment against criteria like local labor laws or international human rights standards, “may mitigate additional scrutiny from union representatives and news media,” the memo said.
Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, says that these audits can be a way for companies to look like they’re taking action to improve conditions in their supply chain, without having to make the more drastic changes that are actually needed.
“There have been instances in a number of industries where audits have been largely performative, just a little bit of theater to give a global company a gold star so that they can say they’re complying with all relevant standards,” he says, noting that it’s difficult to tell in advance whether a potential audit of TikTok’s moderation operations would be similarly cosmetic.
Meta has conducted multiple audits, including one in 2018, when it contracted the consultancy Business for Social Responsibility to assess its human rights impact in Myanmar in the wake of a genocide that UN investigators alleged was partly fueled by hate speech on Facebook. Last year, Meta released its first human rights report. The company has, however, repeatedly delayed the release of the full, unredacted copy of its human rights impact report on India, commissioned in 2019 following pressure from rights groups who accused it of contributing to the erosion of civil liberties in the country.
The TikTok memo says nothing about how the company might use such an assessment to help guide improvements in the material conditions of its outsourced workers. “You’ll note, the recommendations are not suddenly saying, they should just give people access to psychiatrists, or allow them to opt out of the toxic content and prescreen them in a very careful way, or to pay them in a more equal way that acknowledges the inherent hazards of the job,” says Crider. “I think it’s about the performance of doing something.”
Barrett says that there is an opportunity for TikTok to approach the issue in a more proactive way than its predecessors have. “I think it would be very unfortunate if TikTok said, ‘We’re going to try to minimize liability, minimize our responsibility, and not only outsource this work, but outsource our responsibility for making sure the work that’s being done on behalf of our platform is done in an appropriate and humane way.’”