

Salt Lake City, Utah — July 31, 2025
Newly unsealed portions of Utah’s lawsuit against Snap, Inc., the parent company of Snapchat, offer a clearer—and more troubling—picture of the state’s allegations against the tech giant. The original complaint, filed in June 2025 by the Utah Department of Commerce’s Division of Consumer Protection in coordination with the Utah Office of the Attorney General, accuses Snap of practices that the state says endanger children and violate consumer protection laws.
According to newly revealed details:
- High Usage Among Teens: Utah teens have logged nearly 8 billion minutes on Snapchat since 2020. Over half a million users in the state reportedly use the app between 10 PM and 5 AM—hours often associated with heightened vulnerability among youth.
- Concerns Over AI Rollout: Internal messages from Snap engineers called the deployment of the “My AI” chatbot “reckless,” citing inadequate testing. Some employees warned the tool could be manipulated to provide harmful or inappropriate advice—a concern echoed by reports that the AI offered guidance to minors on illicit topics.
- Data Collection and Disclosure: The complaint alleges that My AI continued collecting user location data even when users had enabled “Ghost Mode.” It also claims Snap shared user content, including AI conversations, with external partners such as OpenAI and Microsoft Advertising, without clearly informing users.
- Inadequate Moderation and Safety Tools: According to Snap’s own internal assessments, the platform has struggled to control illicit activity. The complaint notes that the company acknowledged being “overrun” with instances of sexual extortion and drug access. Additionally, more than 96% of in-app abuse reports allegedly went unreviewed by the company’s safety team.
Utah officials argue that these disclosures demonstrate a systemic failure to protect minors, an accusation Snap has previously denied. The unredacted material marks a significant escalation in the state’s legal effort to hold tech platforms accountable for youth safety and data transparency.
The full complaint is available at https://tinyurl.com/dcpvsnapunredacted.
In a statement to TechBuzz, Snap pushed back on Utah’s litigation, asserting that the company has “no higher priority than the safety of Snapchatters.” It said privacy and safety features have been built into the platform from the start and highlighted tools such as Family Center, the Family Safety Hub, and private-by-default settings as evidence of its commitment to teen safety. Snap also underscored its support for federal legislation like the Kids Online Safety Act (KOSA).
The company defended the rollout of its My AI chatbot, calling the Utah complaint’s allegations “misleading.” Snap says that since My AI’s public launch in April 2023 it has added new safety and privacy protections, that it actively monitors the tool, and that 99.5% of the chatbot’s responses adhere to its Community Guidelines. According to Snap, users send around two million messages to the chatbot daily—mostly on benign topics such as sports, pets, and entertainment.
Snap further reiterated its broader platform design philosophy, stating it “cares deeply about the well-being of young people.” It emphasized that Snapchat was intentionally built to minimize social comparison—unlike traditional social media—by excluding public-facing features such as visible friend lists and like counts. Most user activity, the company noted, takes place in private conversations.
To reinforce its position, Snap cited third-party research from the University of Amsterdam, which found Snapchat to be the only major social platform with a positive well-being impact on teens. It also referenced its annual Digital Well-Being Index (DWBI), a global study of Gen Z’s online mental health habits across platforms, released each year on International Safer Internet Day.