A Facebook search for the words “election fraud” first delivers an article claiming that workers at a Pennsylvania children's museum are brainwashing children so they'll accept stolen elections.
Facebook's second suggestion? A link to an article from a site called MAGA Underground that says Democrats are plotting to rig next month's midterms. “You should still be mad as hell about the fraud that happened in 2020,” the article insists.
With less than three weeks before the polls close, misinformation about voting and elections abounds on social media despite promises by tech companies to address a problem blamed for increasing polarization and distrust.
While platforms like Twitter, TikTok, Facebook and YouTube say they've expanded their work to detect and stop harmful claims that could suppress the vote or even lead to violent confrontations, a review of some of the sites shows they're still playing catch-up with 2020, when then-President Donald Trump's lies about the election he lost to Joe Biden helped fuel an insurrection at the US Capitol.
“You would think that they would have learned by now,” said Heidi Beirich, founder of the Global Project Against Hate and Extremism and a member of a group called the Real Facebook Oversight Board that has criticized the platform's efforts. “This isn't their first election. This should have been addressed before Trump lost in 2020. The damage is pretty deep at this point.”
If these US-based tech giants can't properly prepare for a US election, Beirich asked, how can anyone expect them to handle elections overseas?
Mentions of a "stolen election" and “voter fraud” have soared in recent months and are now two of the three most popular terms included in discussions of this year's election, according to an analysis of social media, online and broadcast content conducted by media intelligence firm Zignal Labs on behalf of The Associated Press.
On Twitter, Zignal's analysis found that tweets amplifying conspiracy theories about the upcoming election have been reposted many thousands of times, alongside posts restating debunked claims about the 2020 election.
Most major platforms have announced steps intended to curb misinformation about voting and elections, including labels, warnings and changes to systems that automatically recommend certain content. Users who consistently violate the rules can be suspended. Platforms have also created partnerships with fact-checking organizations and news outlets like the AP, which is part of Meta's fact-checking program.
“Our teams continue to monitor the midterms closely, working to quickly remove content that violates our policies,” YouTube said in a statement. “We'll stay vigilant ahead of, during, and after Election Day.”
Meta, the owner of Facebook and Instagram, announced this week that it had reopened its election command center, which oversees real-time efforts to combat misinformation about elections. The company dismissed criticism that it's not doing enough and denied reports that it has cut the number of staffers focused on elections.
“We are investing a significant amount of resources, with work spanning more than 40 teams and hundreds of people,” Meta said in a statement emailed to the AP.
The platform also said that starting this week, anyone who searches on Facebook using keywords related to the election, including “election fraud,” will automatically see a pop-up window with links to trustworthy voting resources.
TikTok created an election center earlier this year to help voters in the US learn how to register to vote and who's on their ballot. The information is offered in English, Spanish and more than 45 other languages. The platform, now a leading source of information for young voters, also adds labels to misleading content.
“Providing access to authoritative information is an important part of our overall strategy to counter election misinformation,” the company said of its efforts to prepare for the midterms.
But policies intended to stop harmful misinformation about elections aren't always enforced consistently. False claims can often be buried deep in the comments section, for instance, where they nonetheless can leave an impression on other users.
A report released last month from New York University faulted Meta, Twitter, TikTok and YouTube for amplifying Trump's false statements about the 2020 election. The study cited inconsistent rules regarding misinformation as well as poor enforcement.
Concerned about the amount of misinformation about voting and elections, a number of groups have urged tech companies to do more.
“Americans deserve more than lip service and half-measures from the platforms,” said Yosef Getachew, director of Common Cause's media and democracy program. “These platforms have been weaponized by enemies of democracy, both foreign and domestic.”
Election misinformation is even more prevalent on smaller platforms popular with some conservatives and far-right groups like Gab, Gettr and TruthSocial, Trump's own platform. But those sites have tiny audiences compared with Facebook, YouTube or TikTok.
Beirich's group, the Real Facebook Oversight Board, crafted a list of seven recommendations for Meta intended to reduce the spread of misinformation ahead of the elections. They included changes to the platform that would promote content from legitimate news outlets over partisan sites that often spread misinformation, as well as greater attention on misinformation targeting voters in Spanish and other languages.
Meta told the AP it has expanded its fact-checking network since 2020 and now has twice as many Spanish-language fact checkers. The company also launched a Spanish-language fact-checking tip line on WhatsApp, another platform it owns.
Much of the misinformation targeting non-English speakers seems intended to suppress their vote, said Brenda Victoria Castillo, CEO of the National Hispanic Media Coalition, who said that the efforts by Facebook and other platforms aren't equal to the scale of the problem posed by misinformation.
“We are being lied to and discouraged from exercising our right to vote,” Castillo said. “And people in power, people like (Meta CEO) Mark Zuckerberg are doing very little while they profit from the disinformation.”