February 23, 2026

West Virginia sues Apple over CSAM failures


West Virginia’s Attorney General has filed a pioneering lawsuit against Apple, alleging the tech giant enabled the storage and sharing of child sexual abuse material (CSAM) on iCloud and its devices.

The case claims Apple long prioritised user privacy over child safety, and that its full control over hardware, software, and cloud systems made ignorance of the abuse implausible.

A 2020 executive text cited in the suit called iCloud “the greatest platform for distributing child porn” due to privacy features. Apple has faced pressure to combat CSAM while protecting user data. Yet in 2023, while Google reported 1.47 million CSAM instances to the National Center for Missing and Exploited Children and Meta over 30.6 million, Apple filed just 267.

“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” Attorney General JB McCuskey said in a release. “This conduct is despicable, and Apple’s inaction is inexcusable.”


Apple countered that “protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.” Its Communication Safety tool blurs nudity in Messages, FaceTime, and AirDrop, with parental controls emphasising safety and privacy.

McCuskey stressed corporate duties. “There is a social construct that dictates that you also have to be part of solving these large-scale problems, and one of those problems is the proliferation and exploitation of children in this country,” he said.

The suit faults iCloud for easing CSAM access across devices and criticises Apple for skipping tools like Microsoft’s free PhotoDNA, unlike rivals. Apple’s 2021 NeuralHash plan was abandoned over privacy fears. Possession of CSAM is illegal in the U.S. and beyond.

This follows suits like New Mexico’s against Meta for enabling predators. “We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children,” Meta responded then.

West Virginia demands damages and mandatory detection upgrades.
