TikTok knew its platform was harmful to young users and actively worked to keep them on it, Republican Attorney General Russell Coleman alleges in a new lawsuit. Citing internal documents, the lawsuit alleges TikTok was aware that its safety measures were largely ineffective at reducing youth screen time or protecting young users from harmful content that violates TikTok’s own guidelines.
“TikTok was specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control. It doesn’t take much for our kids to fall headfirst into a digital world of unrealistic beauty standards, bullying and low self-esteem,” Coleman said in a statement. “If we don’t hold TikTok accountable, our children will suffer the very real consequences. Nothing less than their mental, physical and emotional health are on the line.”
Large swaths of the 119-page Kentucky court filing are redacted; however, Kentucky Public Radio was able to read the text underneath the digital redactions, which appeared primarily to quote and summarize findings from internal TikTok documents and communications.
More than a dozen other attorneys general across the political spectrum are also suing TikTok in state court, saying the company misled the public about the safety of the platform. Attorneys general in New York, California, the District of Columbia, Illinois, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont and Washington have all filed under their state's or district's consumer protection laws.
“TikTok intentionally manipulates the release of dopamine in Young Users’ developing brains and causes them to use TikTok in an excessive, compulsive, and addictive manner that harms them both mentally and physically,” read the Kentucky lawsuit, which was filed Tuesday in Scott County.
Like most social media apps, TikTok tries to keep users engaged for as long as possible. But the lawsuit alleges certain design features encourage excessive use and harm young users, including a hyper-personalized algorithm that creates content “rabbit holes,” the ability to scroll endlessly and the app’s use of push notifications.
TikTok spokesman Alex Haurek told NPR the accusations in the lawsuits are misleading and said the company had hoped the attorneys general would work with it on “constructive solutions to industrywide challenges."
"We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screen time limits, family pairing, and privacy by default for minors under 16," Haurek said.
TikTok is already dealing with other challenges in the U.S., most notably a nationwide ban of the app that will take effect Jan. 19 unless the company bows to an ultimatum that it sever ties with its China-based parent company, ByteDance.
TikTok does not allow children under 13 years old to use the platform. However, the lawsuit asserts that the app does not utilize age verification software. Young TikTok users are also not allowed to send direct messages, and their accounts are automatically set to private. The app also uses screen time reminders to nudge users about how long they have been scrolling.
Those same safety features are also targeted in the lawsuit, as Coleman’s office argues the company knew that “none of TikTok’s safeguards are meaningful counterbalances to the real and profound harms caused by the design elements.”
In ineffectively redacted portions of the lawsuit, the state summarizes internal documents in which TikTok employees seem to acknowledge that such safeguards, like the screen time nudges, were expected to have only a limited effect on actual screen time — and, indeed, appeared to have a “negligible impact.”
“TikTok measured the success of the tool, however, not by whether it actually reduced the time teens spent on the platform to address this harm, but by three unrelated ‘success metrics,’ the first of which was ‘improving public trust in the TikTok platform via media coverage,’” the lawsuit reads.
According to the court filing, in an experiment on the screen time use prompts, which TikTok refers to publicly as a limit, the average time per day that teens spent on the platform went from 108.5 minutes to about 107 minutes.
“Despite seeing this result, and the fact that the decrease in screen time was far less than the amount TikTok expected and had approved as acceptable, the company did not revisit the design of the tool to be more effective at preventing excessive use of TikTok,” the Kentucky court filing reads.
The lawsuit also alleges that TikTok was aware of how harmful excessive use could be, especially to minors, pointing to things like disrupted sleep patterns and filters that the state says promote specific beauty standards. In redacted portions of the lawsuit, the state cites an internal document that says the effect perpetuates “a narrow beauty norm” that could “negatively impact the wellbeing of our community.”
The lawsuit cited another internal document which allegedly shows that the algorithm at one point prioritized videos featuring “not attractive subjects,” and that TikTok treated this as a problem to be fixed. The wording seems to suggest that after learning the algorithm amplified people it considered unattractive, the company knowingly adjusted the algorithm to exclude them.
The state also argued internal documents showed TikTok prized young users, including those in Kentucky. One cited internal report appeared to break down the preferences and demographic data of new users classified as “country rural.”
“This analysis of users—including their topics of interest and their specific locations in Kentucky—was done for purposes of growing TikTok’s market share within the Commonwealth,” the lawsuit alleges.
The attorneys general also argue that TikTok’s moderation efforts are ineffective, allowing harmful content to filter down to young users. They also claim that TikTok’s live-streaming feature is often abused, and that thousands of underage users have hosted live-streamed videos where users can pay to send digital currency in the form of TikTok “gifts.” The lawsuits say the videos have incentivized the sexual exploitation of children.
“The existence of these virtual rewards greatly increases the risk of adult predators targeting adolescent users for sexual exploitation,” the Kentucky suit reads.
Coleman’s office asked the court to find that TikTok has violated the Kentucky Consumer Protection Act and award the state up to $2,000 for every violation, issue an injunction against TikTok, and order TikTok to give up all profits from its “ill-gotten gains.”
Kentucky Public Radio's Joe Sonka contributed to this report.
State government and politics reporting is supported in part by the Corporation for Public Broadcasting.