TikTok Faces Another Test: Its First US Presidential Election

Originally known for teens' viral dance routines and prank videos, TikTok is now increasingly a destination for political content from its users.

Highlights
  • TikTok has about 100 million monthly active US users
  • TikTok's fact-checking partners are Lead Stories and PolitiFact
  • Unlike Facebook and Twitter, TikTok does not flag misinformation to its users

TikTok, already under scrutiny over its Chinese ownership and threatened with a possible ban by US President Donald Trump, is facing another major challenge: how to handle content around its first US presidential election.

Originally known for teenagers' viral dance routines and prank videos, TikTok is now increasingly a destination for political content from its users. The hashtags Trump2020 and Biden2020 have collectively had more than 12 billion views on the app.

But TikTok's head of US safety Eric Han, in the first interview he has given about TikTok's approach to election misinformation, told Reuters his team's goal is to ensure the app can stay a place for entertainment and "silly self-expression."

TikTok, which says it has about 100 million monthly active US users, is charting its own approach to election-related material, factoring in what Han called the "cautionary tales" of more-established social media rivals.

TikTok's fact-checking partners Lead Stories and PolitiFact said they have reviewed hundreds of videos containing political misinformation on the app, such as claims that Democratic vice presidential candidate Kamala Harris had threatened revenge on Trump supporters, or about who appeared on disgraced financier Jeffrey Epstein's flight logs.

But unlike Facebook and Twitter, TikTok does not flag any misinformation to its users. Instead, the social media app keeps fact-checkers' assessments internal and uses them to remove content, or, less frequently, reduce its reach.

"A lot of us have come from other platforms, we've seen how fact-checking works, we've seen how labeling works," Han said, adding that the company was "very well aware" that fact-checking labels could backfire by making users double-down on inaccurate beliefs or assume all unlabeled content was legitimate.

Social media companies came under pressure to combat misinformation after US intelligence agencies determined Russia used such platforms to interfere in the 2016 election, which Moscow has denied. Facebook, which uses ratings from fact-checkers, including Reuters, to publicly label posts and reduce their distribution, said warnings on COVID-19 misinformation deterred users from viewing flagged content 95 percent of the time.

TikTok, which does not accept political ads, says it does not allow misinformation that could cause harm, including content that misleads users about elections. It has also banned synthetic media, such as a recent video of House Speaker Nancy Pelosi manipulated to make her seem drunk.

"Their whole mission was to bring joy," said David Ryan Polgar, a tech ethicist and member of TikTok's new content advisory council that helps it shape policies. "But with anything that is popular, you're going to have somebody who is going to say 'how can I exploit popularity?'"

To combat such exploitation before and after the election, Han said TikTok staff are meeting weekly to plan for scenarios, from contested election results to disinformation campaigns by "state foreign actors...or a kid in someone's basement."

Members of TikTok's content advisory council also told Reuters that in a meeting last week they discussed issues including voter suppression and whether public supporters of the unfounded political conspiracy theory QAnon should be allowed on the platform, as well as what to do if the app is used to spread misinformation about contested results or incite post-election violence.

"Even if it's not organic to TikTok, it's going to end up there," said Hany Farid, a digital forensics expert and council member, who said he hoped TikTok's election policies would become "clearer."

TikTok's fact-checkers, who also partner with Facebook, said political falsehoods found on TikTok were similar to those spread on Zuckerberg's platform: "It's not just dance challenges any more," said Alan Duke, co-founder of fact-checking partner Lead Stories.

Even as TikTok grapples with content around the US election, the fate of the ByteDance-owned app in the country remains uncertain: the Trump administration is expected to make a decision soon on a proposed deal with Oracle, a plan orchestrated to avoid a US ban.

False claims

With about six weeks to the November 3 election, social media companies' responses to misinformation on their platforms are in the spotlight. On TikTok, Reuters found videos containing false claims about mail-in voting and presidential candidates, several of which TikTok removed after they were flagged by Reuters.

Searching TikTok for 'mailinvote' returns suggestions including 'mailinvotingfraud,' used on videos both spreading and debunking concerns.

A search for the Democratic presidential candidate 'Biden' turned up 'bidentoucheskids' and 'bidensniffskids.' Following questions from Reuters, a TikTok spokeswoman said it was no longer serving results on those hashtags.

Allegations of child sexual abuse are a key element of the QAnon conspiracy theory, which proposes Trump is secretly fighting a cabal of child-sex predators including prominent Democrats and "deep state" allies.

TikTok recently said it had blocked dozens of QAnon-related hashtags. But misinformation researcher Rory Smith at non-profit First Draft identified several more still being used, including 'thestormisuponus,' '2q2q,' 'digitalsoldier,' and 'jfkjr,' which collectively had hundreds of thousands of views. Following Reuters questions, TikTok said it had blocked some of these hashtags.

TikTok's content moderation practices have come under some scrutiny, including from US lawmakers concerned it may be censoring politically sensitive content after reports that it blocked videos about protests in Hong Kong. A TikTok spokeswoman said its content and moderation policies are led by a team in California and are not influenced by any foreign government.

In June, the company apologised after it was accused of censoring #BlackLivesMatter content, blaming a technical glitch it said made posts appear to have zero views.

This year, the company announced a council of outside experts would help it shape US content policies. Rob Atkinson, a council member and president of the Information Technology and Innovation Foundation think-tank, told Reuters he has advised the company a number of times about how policy decisions, such as approaches to hate speech, might play in Washington, DC.

But data security concerns, also stemming from TikTok's ownership by Chinese tech giant ByteDance, have kept major US political figures and groups largely off the app, allowing it to avoid the scrutiny faced by Facebook and Twitter over their handling of inflammatory posts by Trump or other candidates.

Democratic National Committee Chief Technology Officer Nellwyn Thomas told Reuters the DNC has not engaged with TikTok in much depth, and focuses its counter-disinformation work more on Facebook and Twitter. A TikTok spokeswoman said the company has provided the Republican and Democratic National Committees with direct ways to escalate problems.

Graham Brookie, director of the Atlantic Council's Digital Forensic Research Lab, said the counter-disinformation community would be "remiss" not to communicate with TikTok: "We don't get to choose...where we have vulnerabilities."

Misinformation experts remain concerned about the difficulties of moderating TikTok's many-layered videos, which can involve visual and sound effects, overlaid text and hashtags. Users can use 'green screen' effects to share news articles behind them or create split-screen 'duets' with existing videos.

TikTok's comedic tone also makes it hard to tell spoof from skullduggery: last month, the company removed a video shared by the Republican Hype House, which the account's 17-year-old founder Aubrey Moore said was satire.

In the video, which liberal media watchdog Media Matters For America said racked up at least 40,000 views, a Republican Hype House creator, alongside 'BREAKING NEWS' text, falsely claimed that due to COVID-19, Democrats should head to the polls after Election Day. Though Moore said it was a gag, she said the group did not bother to appeal the takedown: it pumps out multiple videos a day.

TikTok advisory council member Farid said he had suggested to the company, partly to be provocative, that to curb misinformation and abuses it could ban new videos in the United States for a few days before and after the election.

His Plan B? "Honestly, I don't know," he said. "I'm struggling with that."

© Thomson Reuters 2020

