Facebook Moderation Guidelines Leaked, Show How It Reviews Hate Speech, Extremist Content

By Reuters | Updated: 22 May 2017 12:10 IST
Highlights
  • Challenges such as revenge porn have overwhelmed Facebook moderators
  • It reviews around 6.5 million reports of potentially fake accounts a week
  • It confirmed that it was using software to intercept graphic content

Leaked Facebook documents show how the social media company moderates issues such as hate speech, terrorism, pornography and self-harm on its platform, the Guardian reported, citing internal guidelines seen by the newspaper.

New challenges such as "revenge porn" have overwhelmed Facebook's moderators who often have just ten seconds to make a decision, the Guardian said. The social media company reviews more than 6.5 million reports of potentially fake accounts a week, the newspaper added.


Many of the company's content moderators have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing, the Guardian said.

Facebook had no specific comment on the report but said safety was its overriding concern.



"Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously", Facebook's Head of Global Policy Management Monica Bickert said in a statement.


Facebook confirmed that it was using software to intercept graphic content before it appeared on the website, but said the software was still in its early stages.

The leaked documents included internal training manuals, spreadsheets and flowcharts, the Guardian said.


The newspaper cited, as an example, a Facebook policy that allows people to live-stream attempts at self-harm because the company "doesn't want to censor or punish people in distress."

Facebook moderators were recently told to "escalate" to senior managers any content related to "13 Reasons Why," the Netflix original drama series based on the suicide of a high school student, because the company feared the series could inspire copycat behavior, the Guardian reported.

Reuters could not independently verify the authenticity of the documents published on the Guardian website.

© Thomson Reuters 2017

 
