AG James Leads Bipartisan Coalition of 14 AGs Alleging TikTok Addicted Young People to its Platform and Collected Their Data Without Consent

Lawsuits Seek to Change TikTok’s Harmful Features to Protect Young People

New York Attorney General Letitia James and California Attorney General Rob Bonta today co-led a bipartisan coalition of 14 attorneys general in filing lawsuits against the social media platform TikTok for misleading the public about the safety of its platform and harming young people’s mental health. The lawsuits, filed individually by each member of the coalition, allege that TikTok violated state laws by falsely claiming its platform is safe for young people. In fact, many young users are struggling with poor mental health and body image issues due to the platform’s addictive features and are getting injured, hospitalized, or dying because of dangerous TikTok “challenges” that are created and promoted on the platform. Attorney General James and the bipartisan coalition of attorneys general are seeking to stop TikTok’s harmful practices and impose financial penalties on the social media company.
“Young people are struggling with their mental health because of addictive social media platforms like TikTok,” said Attorney General James. “TikTok claims that their platform is safe for young people, but that is far from true. In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok’s addictive features. Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis. Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them.”
According to the lawsuits filed by Attorney General James and the bipartisan coalition, TikTok’s underlying business model focuses on maximizing young users’ time on the platform so the company can boost revenue from selling targeted ads. TikTok uses an addictive content-recommendation system designed to keep minors on the platform as long as possible and as often as possible, despite the dangers of compulsive use.
TikTok’s Addictive Features Worsen Young Users’ Mental Health
TikTok uses a variety of addictive features to keep users on its platform longer, which leads to poorer mental health outcomes. Multiple studies have found a link between excessive social media use, poor sleep quality, and poor mental health among young people. According to the U.S. Surgeon General, young people who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety.
Some of these addictive features include:
- Around-the-clock notifications that can lead to poor sleep patterns for young users;
- Autoplay of an endless stream of videos that manipulates users into compulsively spending more time on the platform with no option to disable Autoplay;
- Attention-grabbing content that keeps young users on the platform longer;
- TikTok “stories” and TikTok live content that is only available temporarily to entice users to tune in immediately or lose the opportunity to interact;
- A highlighted “likes” and comments section as a form of social validation, which can impact young users’ self-esteem; and
- Beauty filters that alter one’s appearance and can lower young users’ self-esteem.
Beauty filters have been especially harmful to young girls, with studies reporting that 50 percent of girls believe they do not look good without editing their features and 77 percent saying they try to change or hide at least one part of their body using these filters. Beauty filters can cause body image issues and encourage eating disorders, body dysmorphia, and other health-related problems.
TikTok Challenges Lead to Dangerous Outcomes
TikTok challenges are viral videos that encourage users to perform certain activities, some of which have been harmful and sometimes deadly for young users.
In one example, a 15-year-old boy died in Manhattan while “subway surfing,” a trend where people ride or “surf” on top of a moving subway car. After he passed away, his mother found videos on his TikTok account about subway surfing.
Another example of a dangerous TikTok challenge is the Kia Challenge, a series of videos showing users how to hack the ignition to start and steal Kia and Hyundai vehicles, which has led to thousands of car thefts. In October 2022, four teenagers were killed in a car crash in Buffalo that police suspect was the result of the TikTok Kia Challenge. A Kia Forte was also stolen in New York City and crashed into a house in Greenwich, causing significant damage to both the car and the residence; the ignition was damaged in a manner consistent with the methods described in the TikTok Kia Challenge.
TikTok Profits from Children’s Data
TikTok also violates the Children's Online Privacy Protection Act (COPPA), a federal law designed to protect children’s data on the internet. TikTok actively collects and monetizes data on users under 13 years old, in violation of COPPA, and does so without parental consent. Researchers estimate that 35 percent of TikTok’s U.S. ad revenue is derived from children and teenagers. While TikTok claims that only users age 13 and older can access all of its features, its deficient policies and practices have knowingly permitted children under the age of 13 to create and maintain accounts on the platform.
TikTok Falsely Claims Effectiveness of Safety Tools
TikTok falsely claims that its platform is safe for young users and has misrepresented the effectiveness of its so-called safety tools that are intended to address some of these concerns. Attorney General James’ lawsuit alleges that TikTok also violated New York’s consumer protection laws by misrepresenting its safety measures, including:
- Misleading users about the 60-minute screen time limit it adopted to address concerns about compulsive use of its platform. TikTok deceptively advertised that teens can have a 60-minute screen time limit on the app; however, after using TikTok for 60 minutes, teens are simply prompted to enter a passcode to continue watching videos.
- Misrepresenting the effectiveness of its “Refresh” and “Restricted Mode” features. TikTok claims that users can “Refresh” the content the recommendation system feeds them and that they can limit inappropriate content through “Restricted Mode.” However, those features do not work as TikTok claims.
- Failing to warn young users about the dangers of its beauty filter.
- Misrepresenting that its platform is not directed toward children. TikTok publicly claims that it is not for children under 13; however, the platform features child-directed subject matter, characters, activities, music, and other content, as well as advertisements directed to children.
Through these lawsuits, Attorney General James and the bipartisan coalition of attorneys general are using state laws to stop TikTok from using these harmful and exploitative tactics. In addition, the lawsuits seek to impose financial penalties, including disgorgement of all profits resulting from the fraudulent and illegal practices, and to collect damages for users who have been harmed.
Joining Attorney General James and California Attorney General Bonta in filing today’s lawsuit are the attorneys general of Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, New Jersey, North Carolina, Oregon, South Carolina, Vermont, Washington, and the District of Columbia. Each attorney general filed in their own state jurisdiction.
Today’s lawsuit is Attorney General James’ latest effort to hold social media companies accountable and protect children online. In September 2024, Attorney General James co-led a bipartisan coalition of 42 attorneys general in urging Congress to implement warning labels on social media platforms as called for by the United States Surgeon General. In June 2024, nation-leading legislation advanced by Attorney General James to combat addictive social media feeds and protect kids online was signed into law in New York. In March 2024, Attorney General James led a bipartisan coalition of 41 attorneys general in urging Meta to address the rise of Facebook and Instagram account takeovers by scammers and frauds. In December 2023, Attorney General James led a coalition of 22 attorneys general urging the U.S. Supreme Court to make it clear that states have the authority to regulate social media platforms. In October 2023, Attorney General James and a bipartisan coalition of 32 attorneys general filed a federal lawsuit against Meta for harming young people’s mental health and contributing to the youth mental health crisis.