TikTok’s Black Box Obscures Its Role in Russia’s War
Ten days into Russia’s invasion of Ukraine, TikTok announced it had suspended new posts from Russian accounts due to the country’s new “fake news” law. But the company was quieter about a second policy shift—one that blocked TikTok users in Russia from seeing any content posted by accounts located outside the country.
Findings by social media research collective Tracking Exposed suggest that TikTok enfolded its Russian users in a vast echo chamber intended to pacify President Vladimir Putin’s government. Inside that digital enclave, a network of Russian accounts posting pro-invasion content somehow kept operating. “There was clear manipulation of the information ecosystem on TikTok,” says Salvatore Romano, head of research at Tracking Exposed.
TikTok spokesperson Jamie Favazza declined to comment on Tracking Exposed’s findings and repeated a previous statement that the company had blocked new uploads from Russia. But the platform, owned by Chinese company ByteDance, has been less critical of Russia than its US rivals and has been treated less harshly by Russia’s government. TikTok complied with EU sanctions requiring platforms to block access to Russian state-backed media from Europe. Meta, Google, and Twitter have also adjusted their algorithms to make content from or links to those outlets less visible. In apparent retaliation, Facebook and Twitter were both blocked by Russian internet censors. On March 21, a Moscow court banned Facebook and Instagram from Russia, accusing parent company Meta of “extremist activities.”
TikTok’s actions in Russia and its central role in spreading video and rumor from the war in Ukraine add urgency to open questions about how truth and mistruth circulate on the platform, Romano and other researchers say. TikTok’s geopolitical moment also highlights the challenges faced by researchers trying to answer such questions. The app, launched in 2017, surpassed 1 billion monthly users in September 2021, but it is less well studied, and more difficult to study, than its older rivals.
Most work on the dynamics and downsides of social media has focused on Facebook and Twitter. Tools and techniques developed for those platforms have shone a revealing light on the spread of misinformation about Covid-19 and uncovered online manipulation campaigns linked to governments, including Russia, China, and Mexico. Meta and Twitter provide APIs to help researchers see what is circulating on their platforms.
TikTok does not provide a research API, making it hard to answer questions about its role in spreading accurate or inaccurate information about the Ukraine war and other topics. And while researchers might like to see Meta and Twitter provide broader data access, those platforms at least offer something, says Shelby Grossman, a researcher who has been monitoring pro-Russian posts about Ukraine at Stanford’s Internet Observatory. “It’s tough to look systematically at what’s happening on TikTok,” she says. Researchers have also scrambled to monitor content about Ukraine on the messaging app Telegram, which likewise lacks a research API and is much less studied than US networks.
TikTok spokesperson Favazza says that although the company does not currently provide a research API, “we strongly support independent research,” citing a program that briefs lawmakers and experts in online harms on its moderation and recommendation systems. TikTok has previously claimed the war in Ukraine prompted it to increase moderation and speed up a pilot project labeling state-controlled media accounts, but it did not specify exactly how its operations have changed. On March 24, two TikTok moderators filed a lawsuit against the company alleging psychological harm from “exposure to highly toxic and extremely disturbing images.”
One of the biggest challenges for outside researchers interested in what circulates on TikTok stems from the power and influence of its recommendation algorithm, which plays an outsize role compared to those of older social networks. The app and its rapid growth are built on the For You page, which shows an endless feed of videos curated by TikTok’s algorithm and drawn largely from accounts a user does not follow. As a result, different people see wildly different videos, because each feed is tailored to a user’s past viewing and other signals.