Abusive images in iCloud: US class action lawsuit accuses Apple of inaction
According to the accusations, Apple ignores abuse material in iCloud and thereby harms the victims. The plaintiffs are also calling for a new scanning mechanism.
Apple's refusal to scan devices and iCloud for child abuse material has led to a new lawsuit in the USA: Apple is accused of knowingly offering defective products that allow child sexual abuse material (CSAM) to be distributed unhindered. According to the complaint, this causes lasting harm to the victims and constitutes gross negligence.
The plaintiffs are seeking certification as a class action, and the damages could therefore run into the billions, the New York Times reports ("Amy" and "Jessica" vs. Apple, case number 5:24-cv-8832, United States District Court for the Northern District of California).
Criticism of Apple's inaction
As in a lawsuit filed in August, the complaint once again cites a chat between an Apple manager and a colleague in which the latter writes that iCloud is the "best platform for distributing child pornography etc."
Unlike other tech giants, Apple does not perform large-scale server-side scans of iCloud content using techniques such as PhotoDNA, and iCloud photos can now be fully end-to-end encrypted. According to the lawsuit, this amounts to a failure on the company's part. The victims describe a "never-ending nightmare".
Apple did not comment directly on the allegations, but a spokesperson emphasized to the New York Times that the company "fights such crimes without compromising the security and privacy of all customers". Apple is focusing on protective features to prevent the spread of CSAM before it starts.
Apple's scrapped CSAM scanning project
The new lawsuit also accuses Apple of deliberately abandoning its own CSAM detection project. A good three years ago, Apple presented a technology designed to scan for abuse material directly on iPhones and other devices when photos are uploaded to iCloud. After massive criticism from data protection advocates, security researchers, civil rights activists and customers, the project was ultimately shelved. The approach was "impossible to implement without compromising user security and privacy", Apple's head of privacy conceded at the time. The class action also seeks to force the company to introduce new CSAM scanning mechanisms, both on its devices and in its services.
(lbe)