Apple Sued by West Virginia for Allegedly Allowing CSAM Distribution Through iCloud

Published: February 19, 2026 at 01:05 PM EST
2 min read
Source: MacRumors

Background

West Virginia Attorney General JB McCuskey announced a lawsuit against Apple, alleging that the company knowingly allowed iCloud to be used for the distribution and storage of child sexual abuse material (CSAM). McCuskey claims Apple has “done nothing about it” for years.

“Preserving the privacy of child predators is absolutely inexcusable. And more importantly, it violates West Virginia law. Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re‑victimizing children by allowing these images to be stored and shared.” – Attorney General JB McCuskey

According to the lawsuit (see the [PDF]), Apple internally described itself as the “greatest platform for distributing child porn,” yet it submits far fewer CSAM reports than peers such as Google and Meta.

Apple’s Prior CSAM‑Detection Plans

Apple previously announced a set of child‑safety features, including a system that would detect known CSAM in images stored in iCloud Photos. After backlash from customers, digital‑rights groups, child‑safety advocates, and security researchers, Apple abandoned the plan.

Apple’s statement on the decision:

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Apple later explained that creating a tool for scanning private iCloud data would “create new threat vectors for data thieves to find and exploit.”

The lawsuit alleges that Apple has shirked its responsibility to protect children under the guise of user privacy and that the decision not to deploy detection technology is a choice, not passive oversight. Because Apple controls hardware, software, and cloud infrastructure end‑to‑end, the suit argues the company cannot claim to be an “unknowing, passive conduit of CSAM.”

Relief Sought

  • Punitive damages
  • Injunctive relief requiring Apple to implement effective CSAM detection measures

This article first appeared on MacRumors.com.
