iMessage: users will soon be able to report nude photos to Apple
How can end-to-end encryption and the fight against sexual abuse be reconciled? Apple is taking a new approach.
(Image: oasisamuel / Shutterstock.com)
Apple is integrating a reporting function for nude content into its end-to-end encrypted messenger iMessage. Starting with iOS 18.2, the first users will be able to report photos and videos that the operating system locally recognizes as nude content directly to Apple, the Guardian reports. The feature is part of the "Communication Safety" function built into all Apple operating systems, which is now activated by default for children under the age of 13. Parents can also set up the function on teenagers' devices and optionally activate it for themselves, or switch it off.
Users can forward nude photos and messages to Apple
If the system detects a nude photo received via iMessage, for example, the image is automatically blurred and the recipient is warned about the sensitive content. This is where the new option to forward the received material to Apple will appear. In addition to the photos or videos themselves, "limited surrounding text messages" and the name or account of the person submitting the report are transmitted to the company, according to an iPhone notification dialog published by the Guardian. Apple says it will review the content and may take action, including blocking iMessage accounts and possibly informing law enforcement authorities.
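The local detection described here corresponds to the on-device analysis that Apple also exposes to third-party apps via the SensitiveContentAnalysis framework (iOS 17 and later). As a rough illustration only, the following minimal Swift sketch shows how such a check can be invoked through that public API; it is not Apple's internal iMessage implementation, and the image URL is a placeholder.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: check an incoming image for nudity on the device
// using Apple's public SensitiveContentAnalysis framework (iOS 17+).
// Note: apps need the com.apple.developer.sensitivecontentanalysis.client
// entitlement, and analysis only runs if the user has enabled
// Communication Safety or Sensitive Content Warnings in the settings.
func checkIncomingImage(at url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // If the relevant system setting is disabled, no analysis is available.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is not enabled on this device.")
        return
    }

    do {
        // The analysis runs entirely on the device; nothing is uploaded.
        let result = try await analyzer.analyzeImage(at: url)
        if result.isSensitive {
            // An app would blur the image here and warn the user,
            // analogous to the behavior described for iMessage.
            print("Image flagged as sensitive: blur it and warn the user.")
        } else {
            print("No sensitive content detected.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```

The reporting step that iMessage adds on top of this detection, including the transmission of surrounding messages, is handled by the system itself and has no public equivalent in the framework.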
The function will be introduced in Australia first, where new legal requirements for messaging and cloud services are about to take effect, the Guardian notes. A global rollout of the reporting function is planned, however.
Apple had originally opposed the draft law in Australia (as well as similar legislation in other countries), arguing that it threatened the confidentiality of communications guaranteed by end-to-end encryption. In its final form, the law gives providers more leeway in how they report illegal content, without requiring a backdoor into encryption.
Apple originally wanted to scan iCloud photos on iPhones
Several years ago, Apple considered scanning iCloud photos locally on the iPhone in order to take stronger action against child sexual abuse material (CSAM), with any detected CSAM automatically reported to Apple in the background. After massive criticism from customers, security researchers, and civil rights activists, the company abandoned the project.
The child protection functions planned in parallel, such as the nudity filter, were revised and eventually integrated into the operating systems. Users can tap blurred images to view them anyway; on devices used by children under the age of 13, iOS 18 additionally requires entering the Screen Time passcode, which ideally only the parents know.
Apple has recently faced accusations from various parties of doing too little to combat CSAM, particularly in iCloud, where abusive material is allegedly also distributed via shared albums. A US class action lawsuit accuses the company of ignoring such material.
(lbe)