US class action lawsuit: Apple ignores child abuse material in iCloud

According to the complaint, Apple uses data protection as an excuse to evade responsibility for CSAM in iCloud. The plaintiff is a 9-year-old girl.


A lawsuit seeks to force Apple to take tougher action against the distribution of abuse material in iCloud. The company knows it has a "devastating problem" with child sexual abuse material (CSAM) and has nevertheless failed to act, according to the complaint filed with a US court. The plaintiff is a 9-year-old girl who is reportedly herself a victim of child abuse: unknown persons are said to have persuaded her via iMessage to create CSAM of herself and share it via iCloud (Jane Doe v. Apple, United States District Court, Northern District of California, Case No. 5:24-cv-510).

The lawsuit rests primarily on an iMessage exchange involving a high-ranking Apple manager. In it, the manager writes to a colleague that, because of its data protection efforts, the company offers the "best platform to distribute child pornography etc." This internal conversation from 2020 became public as a result of Epic Games' major lawsuit against Apple.

Specifically, Apple is now accused of not scanning iCloud content, even though technology such as PhotoDNA, which other cloud giants also use, is available. Much iCloud content is not protected by end-to-end encryption and could therefore in principle be checked, the plaintiff notes. The manufacturer could also integrate a reporting mechanism for encrypted material, the complaint continues. It further criticizes that iCloud makes it easy to share photo albums and, allegedly, to cover one's tracks.

The lawsuit is not directed against end-to-end encryption or data protection in general, the lawyers write. They are seeking certification as a class action, millions in damages, and several orders requiring Apple to scan iCloud for abuse material.

Three years ago, Apple announced a CSAM detection technology that was supposed to scan iCloud photos locally on the iPhone. However, the project was shelved after massive criticism from customers, security researchers, civil rights activists, and privacy advocates. Such a hybrid approach was ultimately "impossible to implement without compromising user security and privacy", Apple's head of data protection later admitted in response to criticism from a US child protection initiative that continued to call for the technology to be implemented. The lawsuit now once again accuses Apple of failing to implement the highly controversial project.

There has also been recent criticism from the UK: in contrast to competitors such as Google and Meta, Apple reports only a tiny number of CSAM cases on its servers and invests too little in protective features, according to a child protection organization. Apple itself recently pointed to the nude filter built into its operating systems as a protective feature, which automatically obscures such images and only shows them when tapped.

(lbe)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.