Facebook-Parent Meta to 'Assess Feasibility' of Human Rights Review on Ethiopia Practices

Meta has been under scrutiny from lawmakers and regulators over user safety and its handling of abuses on its platforms.

By Reuters | Updated: 14 January 2022 13:50 IST
Highlights
  • Meta has been under scrutiny from lawmakers and regulators
  • Thousands have died and millions have been displaced
  • The board recommended that Meta rewrite its value statement

Facebook invested significant resources in Ethiopia to identify and remove potentially harmful content

Facebook owner Meta Platforms said on Thursday it would "assess the feasibility" of commissioning an independent human rights assessment into its work in Ethiopia, after its oversight board recommended a review of how Facebook and Instagram have been used to spread content that heightens the risk of violence there.

The board, set up by the company to address criticism over its handling of problematic material, makes binding decisions on a small number of challenging content moderation cases and provides non-binding policy recommendations.

Meta has been under scrutiny from lawmakers and regulators over user safety and its handling of abuses on its platforms across the world, particularly after whistleblower Frances Haugen leaked internal documents that showed the company's struggles in policing content in countries where such speech was most likely to cause harm, including Ethiopia.


Thousands have died and millions have been displaced during a year-long conflict between the Ethiopian government and rebellious forces from the northern Tigray region.


The social media giant said it has "invested significant resources in Ethiopia to identify and remove potentially harmful content," as part of its response to the board's December recommendations on a case involving content posted in the country.

The oversight board last month upheld Meta's original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia's Amhara region. As Meta had restored the post after the user's appeal to the board, the company had to again remove the content.


On Thursday, Meta said while it had taken the post down, it disagreed with the board's reasoning that it should have been removed because it was an "unverified rumor" that significantly increased the risk of imminent violence. It said this would impose "a journalistic publishing standard on people."

An oversight board spokesman said in a statement: "Meta's existing policies prohibit rumors that contribute to imminent violence that cannot be debunked in a meaningful timeframe, and the Board made recommendations to ensure these policies are effectively applied in conflict situations."


"Rumors alleging an ethnic group is complicit in atrocities, as found in this case, have the potential to lead to grave harm to people," they said.

The board had recommended that Meta commission a human rights due diligence assessment, to be completed in six months, which should include a review of Meta's language capabilities in Ethiopia and a review of measures taken to prevent the misuse of its services in the country.

However, the company said not all elements of this recommendation "may be feasible in terms of timing, data science or approach." It said it would continue its existing human rights due diligence and should have an update on whether it could act on the board's recommendation within the next few months.

Reuters has previously reported on how Facebook struggled to monitor content in different languages across the world, including in Myanmar. In 2018, U.N. human rights investigators said the use of Facebook had played a key role in spreading hate speech that fueled violence in Myanmar.

Meta, which has acknowledged it was too slow to prevent misinformation and hate in Myanmar, says it now has native speakers worldwide reviewing content in more than 70 languages, working to stop abuse on its platforms in places where there is a heightened risk of conflict and violence.

The board also recommended that Meta rewrite its value statement on safety to reflect that online speech can pose a risk to the physical security of persons and their right to life. The company said it would make changes to this value, in a partial implementation of the recommendation.

© Thomson Reuters 2022


