Meta, Facebook’s parent company, has rarely been out of the media spotlight in the last few years. Whether it’s a controversial pivot to the metaverse or a steady stream of lawsuits, the drama never stops. Just last December, Meta agreed to pay $725 million to settle a class-action lawsuit claiming it gave third parties access to user data without their consent. And now, the social media giant is in hot water again, this time in Africa.

Last year, a Kenyan law firm filed a lawsuit against Meta and Sama, its main subcontractor for content moderation in Africa. The firm represents Daniel Motaung, a South African national, former content moderator, and Facebook whistleblower who was allegedly laid off for organizing a 2019 strike and trying to unionize Sama’s employees. The suit alleges that Meta and Sama “subjected current and former content moderators to forced labour and human trafficking for labour.” It also claims that Sama tracked employees’ productivity using Meta’s software, which measured their screen time and movement during work hours, and that Sama ran a deceptive recruitment process.

Meta had tried to dodge the case by arguing that, as a foreign company, it fell outside Kenya’s jurisdiction. However, the judge ruled that Meta will remain a party to the case, stating that the company can be held liable because of certain aspects of its operations in Kenya. The full ruling is set to be published soon.

Sama provides content moderation services for Meta’s platforms, including Facebook, Instagram, and WhatsApp, but has faced criticism for creating a toxic work environment and denying moderators mental health support. Notably, Sama is the same firm that handled outsourced work for OpenAI, which is itself in the middle of another lawsuit over the exploitation of Kenyan workers.

In response to the lawsuit, Sama announced in January that it would close its content moderation hub in Kenya. Motaung, however, is still seeking financial compensation for himself and other moderators, and wants Meta and Sama to stop union busting and provide mental health support, among other demands.

Meta’s platforms have massively shaped people’s lives and how we communicate. They provide a space for people to connect, share ideas, and express themselves. But with such a massive reach, companies like Meta must take responsibility for the content shared on their platforms. So it’s no surprise that Meta has faced other lawsuits over content moderation. Last year, for instance, Nigeria’s Advertising Regulatory Council (ARCON) sued the company for showing unapproved adverts to Nigerians.

This is where Sama comes in. As Meta’s main subcontractor for content moderation in Africa, Sama is responsible for removing content that promotes hate, misinformation, and violence. Yet, as the allegations above suggest, the company appears to have fallen short of that responsibility.

It may be a stretch to suggest that this lawsuit will be the fall of a giant. But it could be a significant upset in the growing trend of exploiting remote workers in Africa, and it will, without a doubt, spark a larger conversation about the need for better working conditions for those workers.
