iCloud: Apple sued by US state over child abuse material
West Virginia's Attorney General wants Apple to scan iCloud content more rigorously for so-called CSAM. A lawsuit has now been filed.
iCloud logo: Apple once wanted to scan iPhones for CSAM, but backed down after civil rights protests.
(Image: Apple)
The pressure on Apple to scan iPhones and iCloud for child abuse material is growing. The US state of West Virginia has now sued the company for allegedly failing to report enough CSAM, short for "Child Sexual Abuse Material". Apple has been "doing nothing about it for years," according to Attorney General JB McCuskey. The lawsuit claims that CSAM is "protected" within Apple's ecosystem. In a statement, McCuskey called Apple's "protection of privacy for child molesters" "absolutely unforgivable": the company is refusing to do the morally right thing and is allowing victims to be re-traumatized through the storage and distribution of this material.
Meta reports more
The lawsuit cites, among other things, internal Apple emails in which employees describe iCloud as "the largest platform for the distribution of child pornography." It also points out that Apple files significantly fewer reports with US CSAM reporting agencies than, for example, Meta. Meta, however, operates open social media platforms, whereas iCloud stores private content.
In 2021, Apple attempted to implement new protective measures against CSAM. At the time, plans were made to scan photos directly on the iPhone for such material. The company had developed what it considered a privacy-friendly model, but there were massive protests from civil rights activists and data protection advocates. This led to a rollback. McCuskey now wants such a function to be reintroduced.
The chat control debate
The West Virginia lawsuit is not the only action against Apple in this area. A US class-action lawsuit accusing the company of inaction against CSAM has been ongoing since 2024. Pressure is also mounting in the European Union, where attempts were made to introduce so-called chat control, under which content in WhatsApp, iMessage, and other messengers was to be checked for CSAM before sending. This too drew protests: critics warned that the mechanism could at any time be extended to scan for other material, and users saw their privacy compromised.
However, the issue is still not off the table. Apple has not yet commented on the lawsuit in West Virginia, which could result in high fines. McCuskey argues, among other things, that Apple is not an “unaware, passive distributor” of CSAM.
(bsc)