Kenyan court clears the path for a lawsuit alleging Facebook’s role in escalating Ethiopia’s Tigray conflict.
Ethiopians suing Meta for allegedly failing to adequately moderate content that fueled violence during the Tigray War, which left more than half a million people dead, have been granted the go-ahead to serve the social media giant outside Kenya. It is the latest case seeking to compel Facebook to stop amplifying violent, hateful and inciting posts.
A Kenyan court on Thursday granted petitioners leave to serve Meta in California, U.S., after they failed to trace the social media giant’s office locally. It emerged that while Meta has business operations in Kenya, it doesn’t have a physical office, as its local employees work remotely.
The decision clears the way for the case, filed last December by Kenyan rights group Katiba Institute and Ethiopian researchers Fisseha Tekle and Abrham Meareg, to proceed. Meareg’s father, Professor Meareg Amare Abrha, was killed during the Tigray War after posts on Facebook doxed him and called for violence against him, the lawsuit alleges.
The petitioners are seeking to compel Meta to stop the spread of viral hate on Facebook, ramp up content review at its moderation hub in Kenya, and create a $1.6 billion compensation fund.
The petitioners allege that Facebook’s algorithm amplified hateful and inciting posts because they drew more interactions and kept users logged in for longer.
They claim Facebook “under-invested” in human content review at the Kenyan hub, putting lives at risk as it ignored, rejected or was slow to act on requests to take down posts that also violated its own community standards.
Meareg said his family has firsthand experience of how flawed content moderation can endanger lives and break up families.
He claims his father was murdered after Meta failed to act on repeated requests to take down posts that targeted him and other Tigrayans, as calls for massacres of the ethnic group spread online and offline. The two-year Tigray War, which erupted in November 2020 when the Ethiopian army clashed with Tigray forces, left 600,000 people dead.
“My father was killed because posts published on Facebook identified him, accused him falsely, leaked the address of where he lives and called for his death,” said Meareg, a former PhD student, adding that he was forced to flee the country and seek asylum in the U.S. after his father’s death.
“My father’s case is not an isolated one. Around the time of the posts and his death, Facebook was saturated with hateful, inciteful and dangerous posts…many other tragedies like ours have taken place,” he said.
Meta declined to comment.
Meareg says he reported the posts he came across, but his reports were either rejected or ignored. He claims to have reported several posts in 2021, including one showing dead bodies, and says some of those posts were still live on the platform by the time he went to court last December.
He also faulted Facebook’s content review, saying the hub in Kenya had only 25 moderators responsible for Amharic, Tigrinya and Oromo content, leaving 82 other languages without any personnel to moderate them.
Meta previously told TechCrunch that it employed teams and technology to help it remove hate speech and incitement, and that it had partners and staff with local knowledge to help it develop methods to catch violating content.
“A flaw has been allowed to grow within Facebook, transforming it into a weapon for spreading hatred, violence and even genocide,” said Martha Dark, director of Foxglove, a tech justice NGO supporting the case. “Meta could take real action, today, to pull the plug on hatred spreading across Facebook.”
This is not the first time Meta has been accused of fueling violence in Ethiopia. Whistleblower Frances Haugen previously accused the company of “literally fanning ethnic violence” there, and a Global Witness investigation found that Facebook was poor at detecting hate speech in Ethiopia’s main language.
Social media platforms, including Facebook, have been blocked in Ethiopia since early February, after state-led plans to split the Ethiopian Orthodox Tewahedo Church sparked anti-government protests.
Adding to Meta’s troubles in Kenya
Meta is facing three lawsuits in Kenya.
The company and its content review partner in sub-Saharan Africa, Sama, were sued in Kenya last May by Daniel Motaung, a former content moderator, for exploitation and union busting.
Motaung claimed he was fired by Sama for organizing a 2019 strike that sought to unionize the company’s employees. He is suing Meta and Sama over forced labor, exploitation, human trafficking, unfair labor relations, union busting and failure to provide “adequate” mental health and psychosocial support.
Meta sought to have its name struck off the suit, saying Motaung was not its employee and that the Kenyan court had no jurisdiction over it. However, it failed to stop the lawsuit after the court ruled that Meta had a case to answer, finding that aspects of how the company operates in the country could make it liable. The social media giant has appealed the court’s decision.
Earlier this month, Meta was sued alongside Sama and another content review partner, Majorel, by 183 content moderators who allege they were unlawfully laid off and blacklisted. The moderators claim Sama fired them unlawfully after it wound down its content review arm, and that Meta instructed Majorel, its new Luxembourg-based partner, to blacklist ex-Sama content moderators.
Meta sought to be struck out of this case as well, but last week, the Kenyan court said it had jurisdiction over employer-employee disputes and “matters of alleged unlawful and unfair termination of employment on grounds of redundancy” and that it had power “to enforce alleged violation of human rights and fundamental freedoms” by Meta, Sama and Majorel.