YouTube to Stop Removing Content Spreading Misinformation on Past Elections as Part of New Policy

The new set of updates is part of YouTube's elections misinformation policy that will go into effect immediately.


Highlights
  • YouTube's updated elections misinformation policy takes effect immediately
  • Its policies against hate speech and harassment will also be applied
  • Other social media platforms too have seen a spike in disinformation

Alphabet's YouTube said on Friday that the platform would stop removing content that might have spread false claims related to US presidential elections in 2020 and before. The new set of updates is part of YouTube's elections misinformation policy that will go into effect immediately.

"In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech," YouTube said in a blog post. The platform also said the rest of its policies against hate speech, harassment, and incitement to violence would continue to apply to all user content, including election-related content. The proliferation of disinformation has raised questions about how social media platforms enforce their policies against misleading content about elections.

Other social media platforms like Twitter and Meta Platforms' Facebook have also seen a spike in disinformation related to elections.

In March, YouTube lifted restrictions on former US President Donald Trump's channel, following a suspension of more than two years after the deadly Capitol Hill riot on January 6, 2021.

"We carefully evaluated the continued risk of real-world violence, while balancing the chance for voters to hear equally from major national candidates in the run up to an election," YouTube said in a tweet, referring to the move.

The video-streaming platform banned Trump in 2021 for violating its policy against inciting violence after his supporters stormed the US Capitol as Congress began to certify Joe Biden's victory in the presidential election.

In the same month, the US Federal Trade Commission (FTC) issued orders to eight social media and video streaming firms including Meta Platforms, Twitter, TikTok, and YouTube seeking information on how the platforms screen for misleading advertisements.

© Thomson Reuters 2023

