Facebook Accused of Ignoring Ethiopian Partners’ Warnings on Hateful Content

In Ethiopia, six local partners working with Facebook to flag hateful content say the social media giant routinely ignored their pleas to remove dangerous posts. The partners reported content they deemed hateful or likely to incite violence, yet in some cases Facebook took as long as four months to respond. The issue came to light after the 2021 murder of Ethiopian professor Meareg Amare, whose son, along with one of Facebook’s trusted partners, had warned the company about threatening posts before the killing.

On October 9, 2021, an unofficial Facebook page for Bahir Dar University staff featured a post about Amare. The user shared the professor’s photo with a caption stating: “His name is Professor Meareg Amare Abrha. He is Tigrayan. We will tell you about how he was hiding at Bahir Dar University and carried out abuses.”

Facebook relies on designated trusted partners with linguistic and local expertise to report posts and accounts. However, partners in Ethiopia claim that Facebook allowed hateful content to remain on its platform and was slow to respond to urgent warnings. The accusations echo allegations raised over the company’s role in earlier crises, such as the 2017 Rohingya Muslim crisis in Myanmar, and in a lawsuit filed against Meta that calls for changes to the company’s content-moderation practices.

The Shortcomings of Meta’s Trusted Partner Program in Ethiopia

With 7 million Facebook accounts in Ethiopia, the platform is often the country’s primary venue for news and public discussion. However, misinformation and hate speech are widespread. Meta’s Trusted Partner program aims to combat such issues, but its effectiveness has been questioned.

The Network Against Hate Speech, an all-volunteer digital rights group, began working with Meta in 2021 but faced numerous challenges, including unresponsiveness, disorganization, and security concerns. The group reported over 367 posts and individual accounts promoting violence in October 2021, yet many of those posts remained online as of April 2023.

The Network Against Hate Speech ultimately ended its relationship with Meta on March 21, 2023, questioning why the company relies on under-resourced outside organizations rather than directly employing experts. Critics argue that the Trusted Partner program is merely a “Band-Aid solution” to a problem that requires a more comprehensive approach.

The Psychological Impact of Confronting Hate Speech for Ethiopian Partners

Facebook’s trusted partners in Ethiopia have experienced significant psychological harm while working to combat the platform’s hate speech and graphic content. One partner reported depression and difficulty engaging in everyday activities because of the disturbing content they encountered. Another expressed guilt and anger over the death of Professor Amare, who was killed after the posts targeting him had been flagged to the company.

Meta, the company formerly known as Facebook, struggles to identify and mitigate coded language and calls for violence on its platform. While it employs various mechanisms to address hate speech, those efforts have often proved insufficient.

Meta denies that its algorithm contributes to the spread of hateful content, but whistleblower Frances Haugen testified that it has fueled ethnic violence in Ethiopia.

The company has invested heavily in growth across the Global South without making matching investments in protective tools, leaving users vulnerable to harmful content.


This blog post is based on an article that originally appeared on Business Insider.
