The videos may be short, but they could be filled with misleading claims.
Queenie Wong is a senior writer for CNET News, who focuses on social media companies including Facebook’s parent company Meta, Twitter and TikTok. She previously covered social networks for The Mercury News in San Jose. Before that, she wrote about politics and education for the Statesman Journal in Salem, Oregon. A native of Southern California, she took her first journalism class in middle school.
A misleading 30-second TikTok video shows a woman questioning a Washington state election worker who’s holding out a bag to collect ballots.
“Why are you not allowing us to put them in the ballot box?” the woman asks as she drives closer to a ballot drop box and films the encounter with her smartphone.
The worker tells the woman she’s collecting ballots because voting closes promptly at 8 p.m. on Aug. 2, the day of the primary election in Clark County. The worker then informs the frustrated driver that she can still drop her ballot in the box if she wants to.
The TikTok video made its way to other platforms, including Twitter and Facebook, where users shared it and pushed bogus claims that the clip showed voter fraud. Some users falsely alleged that the worker was illegally closing the ballot box early. Fact-checkers debunked the claims, citing interviews with Clark County officials who confirmed that the election worker's actions weren't out of the ordinary — the worker was merely trying to help voters in line turn in their ballots on time.
The viral video is just one example of the type of misleading footage voters could encounter ahead of the US midterm elections in November. Even as social media companies have increasingly clamped down on written posts, they've yet to get a handle on short-form video. The TikTok clip, for instance, included a label directing users to the app's election center, but it didn't mention that the content was misleading.
It’s a troubling vulnerability considering the volume of short-form videos flooding every major platform, with huge players like Meta-owned Facebook and Instagram and Google’s YouTube embracing the format as a means to compete with rapidly growing TikTok. As more videos proliferate on these platforms, there’s a higher risk that these bite-size clips could be filled with misleading or false claims.
CNET isn’t linking to misleading videos or naming the users, to prevent the content from spreading. After CNET asked TikTok about the above-mentioned video and other examples, the platform pulled them down for violating TikTok’s rules against harmful misinformation.
Social networks will remove or label misinformation, but the amount of content posted online makes it impossible for companies to catch every false claim.
“People are going to be misled in many different ways, and sometimes those don’t have to be complex tactics,” said Jack Brewster, a senior analyst at NewsGuard, a tool that rates the credibility of news sources.
Analysts for NewsGuard found that almost 20 percent of the videos surfaced in TikTok's search results contained misinformation.
“Our Community Guidelines make clear that we do not allow harmful misinformation and will remove it from the platform,” a TikTok spokesperson said in a statement. “We partner with independent fact-checkers who help us to assess the accuracy of content.”
Social media companies use human reviewers and artificial intelligence systems to moderate problematic content. Meta labels videos debunked by fact-checkers, and Twitter has a fact-checking project called Birdwatch, where users can leave notes on misleading or false tweets. But videos combine text, sounds and images, making them trickier to review than written posts.
So it’s more important than ever that you’re aware of this kind of misleading content. Here are red flags to watch out for in short-form videos.
A video clip tells only part of a story, and people can edit footage in ways that leave out important context. That's particularly true of a clip that runs shorter than a 30-second commercial.
Alex Mahadevan, director of the Poynter Institute’s MediaWise project, said it’s important for social media users to be aware of their emotions. MediaWise is trying to empower people to be more critical about the content they see online.
“When something freaks you out online, there’s probably a pretty good chance that it isn’t 100% true,” he said.
In August, fact-checkers debunked the claim that a 40-second TikTok video showed former US President Barack Obama promoting the spread of disinformation. The black-and-white video makes it seem as if Obama is presenting a playbook on how to spread disinformation, when in fact he’s speaking out against it.
The misleadingly edited clip shows Obama saying: “You just have to raise enough questions, spread enough dirt, plant enough conspiracy theorizing that citizens no longer know what to believe. Once they lose trust in their leaders, in mainstream media, in political institutions, in each other, in the possibility of truth, the game’s won.”
But when Obama said those words during a speech at Stanford University in April, he was talking about the tactics used by authoritarian regimes including Russia to spread disinformation. It didn’t take a lot of work for someone to spin those comments in the opposite direction.
Audio in videos can also be manipulated.
When users reshare clips on social media, the original context can get lost, and it might not always be obvious when someone is trying to make a joke or is using manipulated media. This can be especially true on TikTok, where users often riff off of one another by reusing and remixing each other’s audio and video.
In a TikTok post in August, one user included a manipulated video originally posted by someone else that makes it appear as if US Rep. Liz Cheney, a Republican from Wyoming, is announcing a bid for the White House in 2024 with Democrat Hillary Clinton. The August post includes the phrase “Holy Hell!”
The faked Cheney video puts these unlikely words into her mouth: “Hillary and I have one thing in common. We both have bigger balls than Donald Trump, plus our hands are a lot bigger too.” Though Cheney has said she’s thinking about a presidential run and has spoken out against Trump, she hasn’t announced a run for president or uttered the crude words she appears to say in the doctored video.
In this case, the over-the-top language in the faked video might be enough to tip users off that it’s meant as a gag. Yet some TikTokers who commented on the August post appeared confused about whether the clip was real or satire.
Figuring out who originally posted the doctored clip could give users some context and might make it clearer that it’s meant as a joke.
Here’s one thing users can do. Near the bottom of TikTok posts, there’s a link called out by a music-note icon. Clicking on that icon will take you to the account of the person who created any audio that’s been reused. In this case, CNET traced the faked Cheney clip back to an account that was full of obviously satirical videos.
Also, Mahadevan said he always encourages people to visit the profile of whoever’s sharing a video, because it could provide insight into whether that person’s posts are credible.
Researchers have found that, compared with text or audio alone, people are more likely to believe misinformation shared in video.
People tend to believe fake news in video form because seeing something happen with their own eyes gives them more faith that it's credible, according to S. Shyam Sundar, a Penn State University professor and co-director of its Media Effects Research Laboratory.
While people can easily be duped by videos that use artificial intelligence to make it seem as if politicians and others are saying things they never said, there are also less tech-savvy ways to manipulate audio. Audio clips have been slowed down to make it appear as if someone is slurring their speech. And when Russia invaded Ukraine, some users added audio of gunfire and explosions to unrelated video clips to make them appear to show scenes from the invasion.
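The slowed-audio trick mentioned above is simple signal processing: if the same samples are stretched out and played back at the original rate, speech gets both longer and lower in pitch, which the ear can read as slurring. Here's a toy sketch of the idea, using a naive stretch that just repeats each sample (real editing tools interpolate between samples, but the effect on duration and pitch is the same):

```python
def stretch(samples, factor):
    """Naively time-stretch audio by repeating each sample.

    With factor=2, the clip doubles in duration; played back at the
    original sample rate, it sounds half as fast and an octave lower —
    the "slurred speech" effect described above.
    """
    out = []
    for sample in samples:
        out.extend([sample] * factor)
    return out

# A tiny fake "clip" of 4 samples, stretched to twice its length.
clip = [0, 3, -2, 1]
slowed = stretch(clip, 2)
print(slowed)  # [0, 0, 3, 3, -2, -2, 1, 1]
```

The sample values here are made up for illustration; the point is only that stretching the waveform is trivial to do, which is why this kind of manipulation is so common.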
“We have information overload,” Sundar said, noting that people are scrolling through TikTok videos, tweets and other social media posts quickly. “Given that kind of information environment, it’s very unlikely someone would systematically stop and say, ‘Well, let me just verify this is true.'”
Altered images are also something to watch out for in videos.
Green-screen effects in short-form video make it possible for TikTokers and others to easily create videos of themselves speaking in front of images. TikTok also lets users upload pictures and create slideshows. But these images can be faked.
Isaac Harte, a 15-year-old fact-checker in Pennsylvania, has been keeping a close eye on the US Senate race between Democrat John Fetterman and Republican Mehmet Oz. Harte is part of MediaWise’s teen fact-checking network.
One fake image that’s been shared widely on social media shows Oz standing next to a person who appears to be holding a sign for his campaign sideways so it reads “NO” vertically instead of “OZ.” The image was digitally altered to rotate the sign.
The faked photo appeared on Facebook, Instagram, Twitter and other social media sites, and CNET also found it on TikTok. One TikTok clip includes the altered image along with an animation of a cartoonishly rendered “NO” and someone shaking his head.
Though fact-checking a photo might seem intimidating to some people, Harte said it’s something you can easily do. If you do a reverse image search on the photo via Google, he said, you’ll see that it’s been fact-checked.
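For readers comfortable scripting this, a reverse image search can also be kicked off programmatically. The sketch below is a minimal example, assuming the suspect photo is already hosted at a public URL (the URL shown is hypothetical); it builds a Google Lens "search by image" link using the `lens.google.com/uploadbyurl` endpoint, which is how the Lens web interface accepts image URLs but is not a documented API and could change:

```python
from urllib.parse import urlencode

def reverse_image_search_url(image_url: str) -> str:
    """Build a Google Lens reverse-image-search link for a hosted image.

    Note: lens.google.com/uploadbyurl is the endpoint the Lens web UI
    uses for search-by-URL; it is not an official, documented API.
    """
    return "https://lens.google.com/uploadbyurl?" + urlencode({"url": image_url})

# Hypothetical image URL, for illustration only.
link = reverse_image_search_url("https://example.com/suspect-photo.jpg")
print(link)
```

Opening the printed link in a browser shows where else the image appears online, which is often enough to surface an existing fact-check.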
“I say fact-checking is not rocket science all the time,” he said.
There are also resources available online to improve your fact-checking skills. MediaWise offers free media literacy courses in English and Spanish, and PolitiFact has a page that points to its most recent fact-checks of TikTok posts.
TikTok Is a Misinformation Minefield. Don't Get Tripped Up – CNET