TikTok, Facebook approve ads with US election disinformation, study says

Photo: a smartphone displaying the TikTok logo in Paris on April 19, 2024, and Republican presidential candidate Donald Trump arriving at Manhattan Criminal Court in New York City on May 30, 2024. Photo by Antonin UTZ and Seth Wenig / various sources / AFP

WASHINGTON, United States — TikTok and Facebook approved advertisements containing blatant US election falsehoods just weeks ahead of the vote, a watchdog investigation revealed Thursday, calling into question the tech platforms' policies to detect harmful disinformation.

The advocacy group Global Witness submitted eight ads containing false election claims to the Chinese-owned video-sharing app TikTok, the Meta-owned Facebook, and Google-owned YouTube to test their ad systems in the final stretch of the November 5 election.

The ads carried outright election falsehoods -- such as the claim that people can vote online -- as well as content promoting voter suppression, inciting violence against a candidate, and threatening electoral workers and processes.

TikTok "performed the worst," Global Witness said, approving four of them despite its policy that prohibits all political ads.

Facebook approved one of the ads submitted.

"Days away from a tightly fought US presidential race, it is shocking that social media companies are still approving thoroughly debunked and blatant disinformation on their platforms," said Ava Lee, the digital threats campaign leader at Global Witness.

The study comes as researchers warn of the growing perils of disinformation -- both from domestic actors and foreign influence operations -- during a tight election race between the Democratic contender, Vice President Kamala Harris, and Republican nominee Donald Trump.

"In 2024, everyone knows the danger of electoral disinformation and how important it is to have quality content moderation in place," Lee said.

"There's no excuse for these platforms to still be putting democratic processes at risk."

Growing scrutiny

A TikTok spokeswoman said the four ads were "incorrectly approved during the first stage of moderation."

"We do not allow political advertising and will continue to enforce this policy on an ongoing basis," she told AFP.

A Meta spokeswoman pushed back against the findings, saying they were based on a small sample of ads and therefore "not reflective of how we enforce our policies at scale."

"Protecting the 2024 elections online is one of our top priorities," she added.

Global Witness said the ad approved by Facebook falsely claimed that only people with a valid driver's license can vote.

Several US states require voters to provide photo identification, but do not specify that it must be a driver's license.

Global Witness said YouTube initially approved half of the ads submitted, but blocked their publication until formal identification, such as a passport or driver's license, was provided.

The watchdog called that a "significantly more robust barrier for disinformation-spreaders" compared to the other platforms.

Platforms are facing growing scrutiny following the chaotic spread of disinformation in the aftermath of the 2020 election, with Trump and his supporters challenging the outcome after his defeat to Joe Biden.

Google on Thursday said it will "temporarily pause ads" related to the elections after the last polls close on November 5.

The tech giant said the measure, also introduced during the 2020 election, was expected to last a few weeks and was being implemented "out of an abundance of caution and to limit the potential for confusion," given the likelihood that vote counting will continue after Election Day.

Separately, Meta has said it will block new political ads during the final week of the election campaign.
