
Apple allowed child sexual abuse materials on iCloud for years, West Virginia Attorney General claims

Photo: Alexander Pohl/NurPhoto via Getty Images via CNN Newsource

By Ramishah Maruf, Clare Duffy, CNN

New York (CNN) — The West Virginia attorney general’s office sued Apple on Thursday, claiming the tech giant allowed child sexual abuse materials (CSAM) to be stored and distributed on its iCloud service.

The lawsuit claims that Apple prioritized user privacy over child safety for years. The company has tight control over its hardware, software and cloud infrastructure, meaning it cannot claim to be unaware of the issue, the attorney general’s office argued.

Apple has for years faced competing criticisms that it should do more to prevent CSAM from being stored and shared on its services, but also that it should protect the privacy of its users’ personal photos and documents.

The lawsuit says US-based tech companies are federally required to report detected CSAM to the National Center for Missing and Exploited Children. While Google filed 1.47 million reports in 2023, Apple allegedly filed only 267.

“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” West Virginia Attorney General JB McCuskey said in a news release. “This conduct is despicable, and Apple’s inaction is inexcusable.”

“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” an Apple spokesperson said in a comment to CNN.

Apple also pointed to a feature the company offers called Communication Safety, which detects nudity in content a child receives or attempts to send, blurs the image and displays a warning. It works in apps like Messages and FaceTime, as well as over AirDrop, in the iPhone’s Contact Posters feature and in the Photos app image selection tool. The spokesperson added that Apple’s parental controls and features “are designed with the safety, security, and privacy of our users at their core.”

McCuskey said during a news conference on Thursday that large companies with vast resources like Apple have a responsibility to address safety issues like these.

“There is a social construct that dictates that you also have to be part of solving these large-scale problems, and one of those problems is the proliferation and exploitation of children in this country,” he said.

The suit alleges that Apple’s iCloud storage system “reduces friction” for users to repeatedly access and distribute CSAM because it makes it easy to view and search for images and videos across devices.

It’s illegal to possess CSAM in the United States and many other countries.

Apple has built its brand around privacy guarantees for users. But the lawsuit claims that Apple and its leaders have known about the company’s CSAM issues. The complaint includes a screenshot of what it describes as a 2020 text message conversation about the issue, in which one executive suggested the company’s focus on privacy made it “the greatest platform for distributing child porn.”

Other tech companies use tools like Microsoft PhotoDNA to detect child exploitation images, the West Virginia attorney general’s office said. Microsoft says it provides this technology for free to qualified organizations, including tech companies.

Apple said in 2021 it would use its own model called NeuralHash to detect child sexual abuse materials. But it abandoned the plan following backlash from critics about privacy concerns, deciding instead to focus on the Communication Safety feature.

The complaint alleges NeuralHash is a far inferior tool to PhotoDNA. It accuses Apple of negligence for “failing to implement adequate CSAM reporting technologies,” among other claims.

The lawsuit comes amid increased scrutiny of Big Tech’s impact on children. In 2023, the New Mexico attorney general’s office accused Meta of shutting down accounts it used to investigate alleged child sexual abuse on Facebook and Instagram. New Mexico Attorney General Raúl Torrez accused Meta in that lawsuit of creating a “breeding ground” for child predators on those platforms.

Meta strongly pushed back on the claims at the time, saying that “we use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.” More recently, Meta spokesperson Stephanie Otway said in a statement that the New Mexico suit contains “sensationalist, irrelevant and distracting arguments.”

The West Virginia attorney general’s office is seeking statutory and punitive damages and injunctive relief, as well as a requirement that Apple implement effective detection measures.

This story has been updated with additional information and context.

