Instagram head pressed on lengthy delay in launching teen safety features, such as a nudity filter, court filing reveals
Source: TechCrunch
Plaintiffs' attorneys in a lawsuit focused on whether social‑media apps are addictive and harmful asked why it took Meta so long to roll out basic safety tools, such as a nudity filter for private messages sent to teens. In April 2024, Meta introduced a feature that automatically blurs explicit images in Instagram DMs, addressing a risk the company reportedly recognized nearly six years earlier【https://techcrunch.com/2024/04/11/meta-will-auto-blur-nudity-in-instagram-dms-in-latest-teen-safety-step/】.
Deposition of Instagram Head Adam Mosseri
In a newly unsealed deposition in a federal lawsuit, Instagram head Adam Mosseri was questioned about an August 2018 email chain with Meta VP and Chief Information Security Officer Guy Rosen【https://www.linkedin.com/in/guyro/】. The emails warned that “horrible” things could happen via Instagram private messages (DMs), including unsolicited explicit images. Mosseri acknowledged the risk.
When pressed on whether the company should have told parents that its messaging system wasn't monitored beyond the removal of child sexual abuse material (CSAM), Mosseri responded:
“I think that it’s pretty clear that you can message problematic content in any messaging app, whether it’s Instagram or otherwise.”
He added that Meta tries to balance users’ privacy interests with safety concerns.
New Statistics on Harmful Activity
The testimony revealed survey data on teens’ experiences on Instagram:
- 19.2% of respondents aged 13‑15 reported seeing nudity or sexual images they did not want to see.
- 8.4% of the same age group said they had seen someone self‑harm or threaten self‑harm on Instagram within the previous seven days.
Delay of the Nudity Filter
While the nudity filter is just one of several recent updates aimed at protecting teens, plaintiffs' attorneys focused on the delay in implementing it rather than on the app's current safety measures. The 2018 email chain was presented as evidence that Meta knew of the risks to minors yet waited until 2024 to release a product addressing sexual images sent to teens, including images potentially used in grooming.
Meta’s Response
Meta spokesperson Liza Crenshaw highlighted the company’s broader teen‑safety initiatives:
“For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in‑depth research to understand the issues that matter most. We use these insights to make meaningful changes—like introducing Teen Accounts with built‑in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better.”
Ongoing Litigation
The deposition was part of a series of lawsuits seeking to hold big tech accountable for alleged harms to teens. This particular case is filed in the U.S. District Court for the Northern District of California【https://cand.uscourts.gov/cases-e-filing/cases/422-md-03047-ygr/re-social-media-adolescent-addictionpersonal-injury-products】 and alleges that platforms are defective because they are designed to maximize screen time, encouraging addictive behavior. Defendants include Meta, Snap, TikTok, and YouTube (Google).
Similar actions are underway:
- Los Angeles County Superior Court【https://techcrunch.com/2026/02/17/metas-own-research-found-parental-supervision-doesnt-really-help-curb-teens-compulsive-social-media-use/】
- New Mexico state court【https://apnews.com/article/meta-new-mexico-child-exploitation-trial-19195fc680dba782fb971d68082e11a4】
Lawyers across these cases aim to demonstrate that big‑tech companies prioritized user growth and engagement over the safety of their youngest users.
Legal Context
These trials coincide with a growing wave of legislation restricting teens' use of social media, both in several U.S. states and internationally.
Updated after publication with Meta’s comment.