No opt-in: iOS 18 sends image data to Apple without asking
To make personal photos searchable for landmarks, iOS sends image data to Apple servers. Opting out later apparently does not undo this.
A feature for advanced photo analysis that Apple quietly introduced with iOS 18 and macOS 15 has met with criticism. To make the photo library searchable for landmarks, iPhones, iPads and Macs transmit certain image data to Apple servers after installing the current operating system version, without asking the user first. The feature is enabled by default; Apple calls it "Enhanced Visual Search".
Developer Jeff Johnson, who stumbled across the corresponding setting on his iPhone by accident, complained that he never asked for such an "enrichment" of the photos stored locally on his device. Apple had made this decision "silently and without consent". Johnson's post sparked a debate on social media about Apple's privacy practices and led to some misleading media reports accusing the company of tapping into users' photos.
Apple points to privacy safeguards
Apple's own overview of "all new features" in iOS 18 does not mention Enhanced Visual Search. The feature is briefly documented in the privacy notice for the Photos app: "Your device confidentially matches locations in your photos to a global index maintained on Apple servers. Apple uses homomorphic encryption and differential privacy. In addition, an OHTTP relay is used that hides IP addresses. This means that Apple does not receive any information about the content of your photos," the company promises. An entry published in October on Apple's machine learning blog explains how this is implemented technically.
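Apple does not spell out how these techniques are parameterized. Purely as an illustration of the differential-privacy idea, and not of Apple's actual mechanism, the classic Laplace mechanism adds noise calibrated to a sensitivity and a privacy budget epsilon before a value ever leaves the device; all names and values below are made up:

    import random

    def laplace_noise(scale: float) -> float:
        # The difference of two i.i.d. exponential samples is Laplace-distributed.
        return scale * (random.expovariate(1.0) - random.expovariate(1.0))

    def privatize(value: float, sensitivity: float, epsilon: float) -> float:
        # Laplace mechanism: noise with scale sensitivity/epsilon yields
        # epsilon-differential privacy for this single reported value.
        return value + laplace_noise(sensitivity / epsilon)

    # Hypothetical example: report a per-device count of 1 with epsilon = 1.0.
    print(privatize(1.0, sensitivity=1.0, epsilon=1.0))

The noise masks any single user's contribution in aggregate statistics; the smaller epsilon is chosen, the stronger the masking and the less accurate the reported value.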
Apple's operating systems previously analyzed image content purely locally on the device. This lets users search for specific image elements ("bicycle") and scenes ("seashore"), for example. The existing Visual Look Up feature provides additional information on image content and attempts to identify dog breeds, landmarks, plants and insects, among other things; however, this only happens when a photo is opened. Enhanced Visual Search now compares image data with databases on Apple's servers for the first time. Homomorphic encryption makes it possible to compute on the encrypted data, so it does not have to be decrypted on Apple's servers.
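To illustrate only the general principle of computing on encrypted data, and not the scheme Apple actually deploys, textbook RSA with deliberately tiny, insecure parameters is multiplicatively homomorphic: a server can multiply ciphertexts without decrypting them, and only the key holder can read the combined result:

    # Toy parameters, insecure and purely illustrative.
    p, q = 61, 53
    n = p * q                               # public modulus
    e = 17                                  # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))       # private exponent (Python 3.8+)

    def encrypt(m: int) -> int:
        return pow(m, e, n)

    def decrypt(c: int) -> int:
        return pow(c, d, n)

    a, b = 7, 6
    ca, cb = encrypt(a), encrypt(b)

    # The "server" combines the ciphertexts without ever seeing a or b ...
    c_combined = (ca * cb) % n

    # ... and only the key holder can decrypt the result of that computation.
    assert decrypt(c_combined) == (a * b) % n
    print(decrypt(c_combined))              # 42

According to Apple's description, the computation performed on the encrypted data is a match of image data against the global index rather than a simple multiplication, but the privacy argument is the same: the server operates on ciphertexts it cannot read.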
Criticism of the lack of external verification
The function apparently also works when photos are stored only locally on the device and are not synchronized via iCloud. One of the problems appears to be that Apple has not provided an opt-in for this. Users can only switch the feature off after the fact, provided they know where the toggle is ("Settings > Apps > Photos > Enhanced Visual Search" in iOS/iPadOS 18, and in the "General" tab of the Photos app settings on macOS).
Furthermore, the opt-out only applies to photos that are subsequently taken; the existing photo library has already been analyzed in this way.
This is not how you introduce a privacy-friendly product if you have "good intentions", commented cryptographer Matthew Green on Hacker News. Before such a function is activated by default, external security researchers should be given the opportunity to verify Apple's privacy promises.
(lbe)