If you ever wondered how long it takes for conspiracy theories to start gaining steam on social media, a study published this week may have an answer for you: a little more than one day.
A new study from University of Alberta researchers Marco Zenone and Timothy Caulfield documented just how fast health misinformation spreads on TikTok, one of the most popular platforms of the moment. After the World Health Organization recommended public health officials probe public sentiment and address possible misinformation around monkeypox back in May, Zenone and Caulfield decided to conduct a rapid assessment to find out where and how quickly conspiracy theories were spreading on TikTok.
“The study highlights how fast misinformation and conspiracy theories emerge. We should be monitoring platforms to get a sense of the bunk themes so science-informed, engaging, and shareable content can be created and used ASAP to counter the noise,” Caulfield told Gizmodo via email. “For those using TikTok, the study highlights, once again, how much misinformation there is on these platforms.”
To conduct their assessment, the researchers collected and analyzed 864 videos with the hashtag #monkeypox on TikTok on a specific day in May. They identified 153 videos containing conspiracy theories on monkeypox, which generated more than 1.4 million views, 74,328 likes, 7,890 comments, and 13,783 shares.
On average, the videos in their sample were 30.2 hours old, meaning it took them a little more than a day to start making the rounds on TikTok. The researchers published their study in JAMA Network Open Tuesday.
Zenone and Caulfield identified 11 conspiracy theories associated with the #monkeypox hashtag on TikTok. The three most popular theories propagated the false ideas that monkeypox was the next orchestrated pandemic, that monkeypox was introduced to force more people to receive vaccines, and that Microsoft cofounder Bill Gates was involved in the monkeypox outbreak. None of these conspiracy theories are true.
“It’s almost like they knew it was coming, like it was a giant plan, from one pandemic to the next, that’s all it’s going to be now guys, that way they can keep all the control they want and keep everyone scared….This is a giant plan,” one person said in a TikTok video, according to a transcript in the study.
Other conspiracy theories speculated that monkeypox was a ploy to give the WHO power over sovereign countries, that the monkeys that got loose in Pennsylvania had monkeypox and were released on purpose, and that monkeypox was created in a lab, among others. Again, none of these are true.
The study was not without its limitations. According to the authors, one limitation was that they only analyzed videos on TikTok in English under one hashtag. They acknowledged that there were likely TikTok videos with conspiracy theories in other languages and different hashtags.
When asked about Zenone and Caulfield’s study, TikTok told Gizmodo in an email on Friday that it works with the WHO to provide accurate information to its users.
“We remove medical misinformation about monkeypox and have partnered with the World Health Organization to make it easier for people to access facts through video labels, search, prompts, and hashtag PSAs,” a spokesperson said. “We also work with independent fact-checkers who assess content so that we can consistently remove violations of our policies.”
The spokesperson said that TikTok would remove the content cited in the study if the researchers provided them with the links. The study included a summary of the conspiracy theories found and their themes, but not the links.
Caulfield said the study is a reminder that public health officials need to adopt a range of responses to address misinformation on social media, including “rapid debunking and content-informed pre-bunking.”
“We can warn people about the kind of bunk that is emerging,” he said.