Kenya-based Sama, a third-party contractor for Meta, Facebook's parent company, has announced that starting March 2023 it will discontinue its content moderation operations for the tech giant.
The company, which will also lay off 3% of its workforce, cited the "current economic climate" as the reason for the decision. According to Sama staff, the company will focus solely on labelling work—also known as "computer vision data annotation"—which includes positioning animations in augmented reality filters.
Sama disclosed that it will offer affected employees mental health support for 12 months after the end of their employment, as well as undisclosed severance packages.
However, within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta's platforms, including beheadings and child abuse.
"We respect Sama's decision to exit the content review services it provides to social media platforms. We will work with our partners during this transition to ensure there is no impact on our ability to review content," a Meta spokesperson said.
The social media company has contracted Luxembourg-based Majorel to replace Sama. Majorel previously served as TikTok's content moderator in the Middle East and North Africa, where its employees claimed they were treated like robots, reviewing videos of suicide and animal cruelty for less than $3 an hour.
Sama's contract to review harmful content for Meta was worth $3.9 million in 2022, according to internal Sama documents reviewed by TIME. The decision to drop Meta's contract comes months after Daniel Motaung, a South African national and former Sama content moderator, filed a lawsuit in Kenya last year accusing the two firms of forced labour and human trafficking, unfair labour relations, union busting, and failure to provide "adequate" mental health and psychosocial support.
In a separate East African case, Ethiopian researchers Abrham Meareg and Fisseha Tekle, along with the Kenyan human rights group Katiba Institute, supported by legal nonprofit Foxglove, filed a lawsuit in December 2022 accusing Facebook of playing a role in inciting violence during the Ethiopian civil war (also known as the Tigray war).
The lawsuit, which seeks $2 billion in restitution, describes Facebook's conduct as a "woeful failure to address violence on its platform" and criticizes a design that "promotes and prioritizes hateful, incitement and dangerous content".