Meta, the parent company of Facebook and Instagram, is facing a $2.4 billion lawsuit over allegations that its platform contributed to ethnic violence in Ethiopia. A Kenyan high court has ruled that the case, brought by two Ethiopian nationals and a Kenyan NGO, can proceed.
Hateful content contributing to real-world harm
The lawsuit was filed by two Ethiopians, Abrham Meareg and Fisseha Tekle, and The Katiba Institute, a Kenya-based NGO. They argue that Facebook’s algorithms amplified hate speech and incendiary content, fueling violence during Ethiopia’s civil war. They claim the platform’s design allowed dangerous content to go viral, ultimately leading to grave human rights violations.
Meareg is the son of Professor Meareg Amare Abrha, who was murdered at his home in Ethiopia in 2021, during the country’s civil war, after Facebook posts shared his address and threatened him. Fisseha Tekle, a former researcher at Amnesty International, wrote reports on violence committed during the conflict in Tigray in northern Ethiopia and received death threats on Facebook.
Landmark case gets the go-ahead
Meta argues that courts in Kenya, where Facebook’s Ethiopia content moderators were based at the time, did not have authority over the case. However, the Kenyan high court in Nairobi ruled that the case fell within the jurisdiction of the country’s courts.
“I am grateful for the court’s decision today,” Meareg said in a statement. “It is disgraceful that Meta would argue that they should not be subject to the rule of law in Kenya. African lives matter.”
“Meta cannot undo the damage it has done, but it can radically change how it moderates dangerous content across all its platforms to make sure no-one else has to go through what I have,” Tekle added.
Demands for accountability and reform
The lawsuit demands that Meta create a restitution fund for victims of hate and violence and change its algorithms and moderation practices to stop the spread of viral hate speech. Meta is also being asked to take steps to demote incitements to violence on its platforms and hire enough content moderators to avoid further damage in East and Southern Africa.
Major human rights organizations have supported the case, including Amnesty International, Global Witness, Article 19, the Kenya Human Rights Commission, and Kenya’s National Cohesion and Integration Commission.
Meta’s broader content moderation battles
This isn’t the first time Meta has been taken to court in Kenya. In 2023, a similar ruling allowed 185 Facebook content moderators to sue the company over claims of unlawful termination and poor working conditions.
In January 2025, Meta also faced backlash after announcing plans to phase out its fact-checking program, replacing it with a “community notes” system similar to the one used by X. Campaigners against hate speech online criticized Meta’s shift as an attempt to shrug off responsibility for managing hate speech and disinformation on its platforms and raised concerns about the impact on people of color and other marginalized groups.
Image: Foxglove