By Adam Rawnsley
When the Jan. 6 committee wanted to test how easy it was for TikTok users to wander down a far-right rabbit hole, they tried an experiment. They created Alice, a fictional 41-year-old from Acton, Massachusetts, gave her a TikTok account, and tracked what the social media app showed her.
To their surprise, it only took 75 minutes of scrolling — with no interaction or cues about her interests — for the platform to serve Alice videos featuring Nazi content, following a detour through clips on the Amber Heard-Johnny Depp defamation suit, Donald Trump, and other right-wing culture war flashpoints.
Staff described the exercise as “just one of the Committee’s experiments that further evidenced the power of TikTok’s recommendation algorithm in creating rabbit holes toward potentially harmful content.”
The experiment is detailed in a draft summary of investigative findings prepared by the committee’s social media team and obtained by Rolling Stone. The company mostly escaped notice in the public battles over the role of social media and moderation in combating extremism, including the kind that led to the Capitol attack. But the unpublished summary sheds new light on how TikTok has grappled with the challenge of “how to moderate misleading content without attracting accusations of censorship,” in particular when “the mis- and disinformation benefitted the political right,” according to staffers.
TikTok did not respond to a request for comment from Rolling Stone.
The revelations come at a delicate time for the Chinese-owned social media app, which is facing renewed criticism from federal agencies and Congress over the alleged risks its ownership poses to national security.
Over the past few weeks, several states and the Biden administration have banned the app from government devices on security grounds. Others, like FBI Director Chris Wray, have expressed concerns that Chinese officials could some day use their leverage over the company and the data it collects “to manipulate content, and if they want to, to use it for influence operations.”
TikTok survived its first brush with regulation when former President Trump tried and failed to ban the app from the U.S. and force its Chinese parent company, ByteDance, to sell off the platform to an American company.
In their summary, Jan. 6 committee staffers described the moderation and enforcement policies of major social media companies like Facebook and Twitter as constantly hindered and shaped by fears of Republican criticism.
That same precarious political balancing was evident in the summer of 2020 as TikTok staffers sat down to write policies addressing the range of online pitfalls like deepfakes, troll networks, misinformation, and disinformation. One document TikTok shared with the committee notes that “one TikTok staffer modified the descriptions of a policy proposal on mis- and disinformation because ‘otherwise it may pick up much of Fox News.’”
But unlike the mainstream social media giants, the committee’s social media team concluded that TikTok “does not appear to have been a major source of news and information” for Jan. 6 insurrectionists. Still, they encouraged continued scrutiny of TikTok because it “continues to attract the mix of hyper-partisan commentators, conspiracy theorists, and extremists active on other platforms.”
Staffers on the Jan. 6 committee found that TikTok’s “approach to trust & safety overlapped significantly with peer platforms” and in some cases outperformed them. The draft described TikTok as “ahead of its peers” in responding to Trump’s “Stop the Steal” movement because, unlike Facebook, the company had policies already in place banning election delegitimization, allowing it to respond more quickly.
Like Twitter and Facebook, TikTok has a list of roughly 200 high-profile accounts that require a second layer of review if moderators want to enforce policies. Such programs, like Facebook’s “Crosscheck,” have attracted criticism for allegedly insulating prominent celebrities and politicians from content moderation rules other users are subject to. TikTok’s high-profile list, the company told the committee, consists primarily of celebrities and influencers, with relatively few politicians.
TikTok also appeared to be spared from much of the insurrectionist content found on other platforms by its audience, which skewed much younger than the rioters’ median age of 35, according to the committee.
The absence of such content wasn’t for lack of trying on the Trump campaign’s part. In a “strategic communications plan” from Rudy Giuliani previously released by the committee, the former New York City mayor urged the Trump legal team that “WE have to use TIKTOK!! Content goes VIRAL here like no other platform!!!!!. And there are MILLIONS of Trump supporters!” Giuliani’s memo exclaimed with prolific all-caps emphasis. “It would be amazing if POTUS would use the platform actually – he’d have the biggest account EVER.”
But in other areas, TikTok was slow to respond to content from known extremist groups that were involved in the insurrection. Company officials told the committee that TikTok didn’t ban the far-right Boogaloo movement from its platform until April 2021, months after the riot.
According to the draft summary, TikTok also received private complaints from the Democratic National Committee about its handling of QAnon content in Sept. 2020. The company, staffers wrote, responded to the message by stating that QAnon content violated its policy and it would “remove content and ban accounts.”
TikTok, like other companies, removes content that blatantly violates its rules and reduces the visibility and promotion of “grayzone” content where violations are more subjective. Company officials told the committee that 90 percent of content in the “grayzone” of its rules prohibiting QAnon received fewer than a thousand views. But committee staffers noted that the small sliver of QAnon content that slipped through grayzone enforcement still racked up millions of views.