The chief executive of a company contracted to moderate Facebook posts in east Africa has said she regretted taking on the work, after its staff said they were left traumatised by graphic content on the social media platform.
The US outsourcing firm Sama is facing a number of legal cases brought by Kenya-based employees, who allege they were exposed to graphic and traumatic content, including videos of beheadings and suicides, at a moderation hub.
Sama’s chief executive, Wendy Gonzalez, told the BBC that the company, which has an office in Nairobi, would no longer take on work that involved moderating harmful content.
“You ask the question: ‘Do I regret it?’ Well, I would probably put it this way. If I knew what I know now, which included all of the opportunity, energy it would take away from the core business I would have not entered [the agreement],” she said.
Sama began moderating Facebook material in 2019, and Gonzalez said the work never represented more than 4% of the company's overall business.
A Kenyan court ruled in February this year that a case brought against Facebook by a former Sama content moderator could go ahead.
Daniel Motaung, who was hired as a Facebook content moderator by Sama in 2019, filed a lawsuit against the two companies last year, alleging he had been exposed to graphic and traumatic content at work without adequate prior knowledge or proper psychosocial support – which he said left him with post-traumatic stress disorder.
He also claimed he was unfairly dismissed after trying to unionise his co-workers to fight for better conditions.
Facebook’s parent company, Meta, contested its inclusion as a party in the case. The firm argued that Sama was Motaung’s employer and that Meta could not be subject to a hearing in Kenyan courts because it was neither registered nor operating in the country.
Meta also said it required all companies it worked with to provide round-the-clock support, while Sama claimed that certified wellness counsellors were always on hand for its employees.
About 260 screeners at Facebook’s moderation hub in Nairobi were made redundant this year as the US tech company switched its moderation provider to the European firm Majorel.
But moderators claim they were given “varying” and “confusing” reasons for the mass layoffs, and believe it was an effort to suppress growing worker complaints over low pay and lack of mental health support.
A Kenyan employment court heard testimony of the traumatic nature of the moderators’ day-to-day work. “I remember my first experience witnessing manslaughter on a live video … I unconsciously stood up and screamed. For a minute, I almost forgot where I was and who I was. Everything went blank,” read one of the written testimonies.
Frank Mugisha, 33, a moderator from Uganda, told the Guardian: “I’ve seen stuff that you’ve never seen, and I’d never wish for you to see.”