Reporting of abuse images: NGO criticizes Apple for "underreporting"
Major online platforms are required to report discovered child sexual abuse material (CSAM). A British child protection organization is now criticizing Apple.
- Ben Schwan
Apple is facing accusations in the UK of failing to report abuse material, or of not reporting it adequately. As the Guardian reported this week, the National Society for the Prevention of Cruelty to Children (NSPCC) accuses the iPhone maker of massive "underreporting".
Significantly fewer cases than Meta and Google
According to data obtained by the NSPCC under the UK Freedom of Information Act, Apple services are said to have been involved in a total of 337 recorded "offenses of child abuse images" in England and Wales between April 2022 and March 2023. Yet for the whole of 2023, Apple reported only 267 cases of abuse material to the US National Center for Missing & Exploited Children (NCMEC) across all its platforms worldwide. That is far fewer than other large tech companies report: according to the NCMEC annual report, Google reported 1.47 million cases and Meta as many as 30.6 million.
"There is a worrying discrepancy between the number of abuse images perpetrated via Apple services in the UK and the almost negligible number of reports of abuse content to the authorities worldwide," said Richard Collard, Head of Online Child Protection at the NSPCC. Apple is "clearly lagging behind" the other tech companies here.
"Black hole" Apple
Apple declined to answer the Guardian's questions, pointing instead to its guidelines. After massive criticism, the company had decided against introducing a system to scan iCloud photos directly for CSAM, since doing so could break encryption or otherwise harm privacy protections. The Los Angeles-based child protection organization Heat Initiative stated that Apple "does not detect CSAM at all" in "the majority of its environments" and has "not invested sufficiently in Trust & Safety teams".
The NSPCC also fears that Apple's rollout of AI features under Apple Intelligence could become a problem because of computer-generated abuse images. However, the company is expected to use filters and is in any case proceeding far more cautiously than its competitors: according to Apple, generating photorealistic images will not be possible. As the Heat Initiative puts it, Apple is "a black hole" when it comes to tracking CSAM.
(bsc)