Meta Contractor Ignored Ethiopian Rebels' Threats Against Moderators, Court Documents Show

Content Moderators' Safety at Risk

A contractor hired by Facebook's parent company Meta has been accused of ignoring threats that Ethiopian rebels made against content moderators, according to court documents filed in a case challenging the dismissal of dozens of moderators in Kenya.

Last year, 185 content moderators sued Meta and two contractors, alleging they lost their jobs with Sama, a Kenya-based firm contracted to moderate Facebook content, for trying to organize a union. They further claimed to have been blacklisted from applying for the same roles at another firm, Majorel, after Facebook changed contractors.

Moderators assigned to Ethiopia reported being targeted by members of the Oromo Liberation Army (OLA) rebel group for taking down the group's videos, but Sama dismissed their complaints, according to court documents filed on December 4 by Foxglove, a British non-profit supporting the moderators' case.

Sama accused the moderators of "creating a false account and manufacturing" the threatening messages before eventually agreeing to an investigation and sending to a safehouse one moderator who had been publicly identified by the rebels. Sama told Reuters it was unable to comment on the allegations. Spokespeople for Meta and OLA did not respond to requests for comment.

Fear and Dismissal

One moderator stated in his affidavit that he had received a message from OLA threatening "content moderators who were constantly pulling down their graphic Facebook Posts". He added that his supervisor dismissed his concerns. Another moderator said in his affidavit that he received a message from OLA listing his and his colleagues' names and addresses, and that he now lives in fear of visiting his family in Ethiopia.

Expert Advice Ignored

The court documents also said that Meta ignored advice from experts it hired to tackle hate speech in Ethiopia. One expert, who supervised dozens of moderators, said in an affidavit that she felt "stuck in an endless loop of having to review hateful content that we were not allowed to take down because it technically did not offend Meta policies".

Out-of-court settlement talks between the moderators and Meta collapsed in October last year. The case could have implications for how Meta works with content moderators globally.