Meta's moderation failures incite hate and human rights abuses, according to Amnesty International

The human rights organization is pressing Meta to compensate victims of Ethiopia's Tigray civil conflict.
By Chase DiBenedetto
Amnesty International stands behind a group of petitioners suing Meta for its role in exacerbating a civil war in Ethiopia. Credit: Lionel Bonaventure/AFP via Getty Images

Meta and its platform Facebook are facing continued calls for accountability and reparations following accusations that the company's platforms can exacerbate violent global conflicts.

The latest push comes in the form of a new report by human rights organization Amnesty International, which looked into Meta's content moderation policies during the beginnings of an ongoing conflict in Ethiopia's Tigray region and the company's failure to respond to civil society actors calling for action before and during the conflict.

Released on Oct. 30, the report — titled "A Death Sentence For My Father": Meta's Contribution To Human Rights Abuses in Northern Ethiopia — focuses on the social media mechanisms behind the Ethiopian armed civil conflict and ethnic cleansing that broke out in the northern part of the country in Nov. 2020. More than 600,000 civilians were killed in fighting between forces aligned with Ethiopia's federal government and those aligned with regional governments. The civil war later spread to the neighboring Amhara and Afar regions, during which time Amnesty International and other organizations documented war crimes, crimes against humanity, and the displacement of thousands of Ethiopians.

"During the conflict, Facebook (owned by Meta) in Ethiopia became awash with content inciting violence and advocating hatred," writes Amnesty International. "Content targeting the Tigrayan community was particularly pronounced, with the Prime Minister of Ethiopia, Abiy Ahmed, pro-government activists, as well as government-aligned news pages posting content advocating hate that incited violence and discrimination against the Tigrayan community."

The organization argues that Meta's "surveillance-based business model" and algorithm, which "privileges ‘engagement’ at all costs" and relies on harvesting, analyzing, and profiting from people’s data, led to the rapid dissemination of hate-filled posts. A recent report by the UN-appointed International Commission of Human Rights Experts on Ethiopia (ICHREE) also noted the prevalence of online hate speech that stoked tension and violence.

Amnesty International has made similar accusations against the company for its role in the targeted attacks, murder, and displacement of Myanmar's Rohingya community, and argues that corporate entities like Meta have a legal obligation to protect human rights and exercise due diligence under international law.


In 2022, victims of the Ethiopian war filed a lawsuit against Meta for allowing inflammatory posts to remain on its social platform during the active conflict, based on an investigation by the Bureau of Investigative Journalism and the Observer. The petitioners allege that Facebook's recommendation systems amplified hateful and violent posts and allowed users to post content inciting violence, despite the company being aware that such content was fueling regional tensions. Some also allege that these posts directly led to the targeting and deaths of individuals.

Filed in Kenya, where Meta’s sub-Saharan African operations are based, the lawsuit is supported by Amnesty International and six other organizations, and calls on the company to establish a $1.3 billion fund (or 200 billion Kenyan shillings) to compensate victims of hate and violence on Facebook.

In addition to the reparations fund, Amnesty International is calling on Meta to expand its content moderation and language capabilities in Ethiopia, and to publicly acknowledge and apologize for contributing to human rights abuses during the war, as outlined in its recent report.

The organization's broader recommendations also include the incorporation of human rights impact assessments in the development of new AI and algorithms, an investment in local language resources for global communities at risk, and the introduction of more "friction measures" — or site design that makes the sharing of content more difficult, like limits on resharing, message forwarding, and group sizes.

Meta has previously faced criticism for allowing unchecked hate speech, misinformation, and disinformation to spread on its algorithm-based platforms, most notably during the 2016 and 2020 U.S. presidential elections. In 2022, the company established a Special Operations Center to combat the spread of misinformation, remove hate speech, and block content that incited violence on its platforms during the Russian invasion of Ukraine. It's deployed other privacy and security tools in regions of conflict before, including a profile lockdown tool for users in Afghanistan launched in 2021.

Additionally, the company has recently come under fire for excessive moderation, or "shadow-banning," of accounts sharing information during the humanitarian crisis in Gaza, as well as for fostering harmful stereotypes of Palestinians through inaccurate translations.

Amid ongoing conflicts around the world, including continued violence in Ethiopia, human rights advocates want to see tech companies doing more to address the quick dissemination of hate-filled posts and misinformation.

"The unregulated development of Big Tech has resulted in grave human rights consequences around the world," Amnesty International writes. "There can be no doubt that Meta’s algorithms are capable of harming societies across the world by promoting content that advocates hatred and which incites violence and discrimination, which disproportionately impacts already marginalized communities."

Chase DiBenedetto
Social Good Reporter

Chase joined Mashable's Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she's very funny.

