West Virginia is suing Apple, alleging negligence over CSAM

Published: February 19, 2026 at 11:46 AM EST
2 min read
Source: Engadget

Lawsuit Overview

The office of the Attorney General for West Virginia announced Thursday that it has filed a lawsuit against Apple alleging that the company “knowingly” allowed its iCloud platform to be used as a vehicle for distributing and storing child sexual abuse material (CSAM). The state claims this went on for years without action from Apple, which it says hid behind “the guise of user privacy.”

Allegations

  • The lawsuit repeatedly cites a text from Apple executive Eric Friedman, in which he calls iCloud “the greatest platform for distributing child porn” during a conversation with another Apple executive.
  • These messages were first uncovered by The Verge in 2021 within discovery documents for the Epic Games v. Apple trial. In the conversation, Friedman says that while some other platforms prioritize safety over privacy, Apple’s priorities “are the inverse.”
  • The state further alleges that detection technology to help root out and report CSAM exists, but Apple chooses not to implement it. Apple did consider scanning iCloud Photos for CSAM in 2021, but abandoned those plans after pushback stemming from privacy concerns.

In 2024, Apple was sued by a group of over 2,500 victims of child sexual abuse who made nearly identical claims, alleging that Apple’s failure to implement these features led to the victims’ harm as images of them circulated through the company’s servers. At the time, Apple told Engadget:

“Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”

Potential Outcomes

The West Virginia case marks the first time a governmental body has brought such an action against the iPhone maker. The state says it is seeking injunctive relief that would compel Apple to implement effective CSAM detection measures, as well as damages. Apple has been contacted for comment.
