YouTube to Tweak Algorithms to Stop Recommending Conspiracy Theory Videos

By Elizabeth Dwoskin, The Washington Post | Updated: 28 January 2019 14:27 IST
Highlights
  • YouTube will retool its recommendation algorithm that suggests new videos
  • The change to the algorithms is the result of a six-month-long effort
  • YouTube said it would apply to less than one percent of the content

YouTube said Friday it is retooling its recommendation algorithm that suggests new videos to users, in order to prevent it from promoting conspiracies and false information, reflecting a growing willingness to quell misinformation on the world's largest video platform after several public missteps.

In a blog post that YouTube published Friday, the company said that it was taking a "closer look" at how it can reduce the spread of content that "comes close to - but doesn't quite cross the line" of violating its rules. YouTube has been criticised for directing users to conspiracies and false content when they begin watching legitimate news.

The change to the company's so-called recommendation algorithms is the result of a six-month-long technical effort. It will be small at first - YouTube said it would apply to less than one percent of the site's content - and will only affect English-language videos, meaning that much unwanted content will still slip through the cracks.


The company stressed that none of the videos would be deleted from YouTube. They would still be findable by people who search for them or subscribe to conspiratorial channels.


"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," the blog post said.

YouTube, which has historically given wide latitude to free speech concerns, does not prohibit conspiracy theories or other forms of false information. The company does ban hate speech, but defines it somewhat narrowly as speech that promotes violence or hatred of vulnerable groups.


Advocates say those policies don't go far enough to prevent people from being exposed to misleading information, and that the company's own software often pushes people to the political fringes by feeding them extremist content that they did not seek out.

YouTube's recommendation feature suggests new videos to users based on the videos they previously watched. The algorithm takes into account "watch time" - the amount of time people spend watching a video - and the number of views as factors in the decision to suggest a piece of content. If a video is viewed many times to the end, the company's software may treat it as a quality video and automatically start promoting it to others. Since 2016, the company has also incorporated satisfaction, likes, dislikes, and other metrics into its recommendation systems.
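In rough terms, the signals described above add up to a weighted score for each candidate video. Below is a minimal Python sketch of that idea; the field names, weights, and formula are illustrative assumptions, not YouTube's actual (proprietary) model.

from dataclasses import dataclass

@dataclass
class VideoStats:
    views: int
    avg_watch_seconds: float   # average watch time per view
    duration_seconds: float
    likes: int
    dislikes: int

def recommendation_score(v: VideoStats) -> float:
    # Share of the video a typical viewer actually watches.
    completion = v.avg_watch_seconds / max(v.duration_seconds, 1.0)
    # Net satisfaction from likes and dislikes, in [-1, 1].
    satisfaction = (v.likes - v.dislikes) / max(v.likes + v.dislikes, 1)
    # Hypothetical weights: watch time dominates, raw view count matters least.
    return 0.6 * completion + 0.3 * satisfaction + 0.1 * min(v.views / 1_000_000, 1.0)

# A video watched mostly to the end outscores a heavily viewed but
# quickly abandoned one - the dynamic the article describes.
print(recommendation_score(VideoStats(50_000, 540, 600, 900, 50)))      # ~0.81
print(recommendation_score(VideoStats(2_000_000, 45, 600, 3_000, 2_500)))  # ~0.17

Under any scheme of this shape, content that keeps people watching scores well regardless of accuracy, which is one reason demoting borderline videos has to be handled as a separate step.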


But from a mainstream video, the algorithm often takes a sharp turn to suggest extremist ideas. The Washington Post reported in December that YouTube continues to recommend hateful and conspiratorial videos that fuel racist and anti-Semitic sentiment.

More recently, YouTube has developed software to stop conspiracy theories from going viral during breaking news events. In the aftermath of the Parkland school shooting last February, a conspiracy theory claiming that a teenage survivor of the shooting was a so-called "crisis actor" was the top trending item on YouTube. In the days following the October 2017 massacre in Las Vegas, videos claiming the shooting was a hoax garnered millions of views.

YouTube's separate search feature has also been called out for promoting conspiracies and false content. Earlier this month, for instance, a search for RBG, the initials of Supreme Court Justice Ruth Bader Ginsburg, returned a high number of far-right videos peddling conspiracies - and little authentic content related to the news that she was absent from the court while recovering from surgery.

Six months ago, YouTube began to recruit human evaluators who were asked to review content against a set of guidelines. The company then used the evaluators' feedback to train the algorithms that generate recommendations.
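That human-in-the-loop step amounts to training a classifier on evaluator verdicts and using its output to demote borderline videos. A minimal sketch, assuming scikit-learn and entirely hypothetical per-video features; nothing here reflects YouTube's actual signals or model.

from sklearn.linear_model import LogisticRegression

# Hypothetical features per video, e.g. conspiratorial-keyword score,
# misleading-thumbnail score, channel strike count. Labels are the
# evaluator verdicts: 1 = borderline misinformation, 0 = acceptable.
features = [[0.9, 0.8, 3], [0.1, 0.0, 0], [0.7, 0.6, 2], [0.2, 0.1, 0]]
labels = [1, 0, 1, 0]

model = LogisticRegression().fit(features, labels)

def demote_borderline(candidates, scores, threshold=0.5):
    """Zero out the recommendation score of likely-borderline videos
    without deleting them - they stay searchable, as the article notes."""
    risks = model.predict_proba(candidates)[:, 1]
    return [0.0 if risk > threshold else score
            for score, risk in zip(scores, risks)]

# One risky and one benign candidate; only the benign one keeps its score.
print(demote_borderline([[0.8, 0.9, 4], [0.1, 0.2, 0]], [0.75, 0.70]))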

© The Washington Post 2019

