Google search will soon suppress more unwanted intimate photos
Intimate photos and videos published without consent should appear less often in Google's search results. A British service is behind the change.
Intimate images posted online without the consent of the person depicted are to become harder to find. To this end, Google is now cooperating with the British Revenge Porn Helpline: images registered on its website StopNCII.org will no longer appear in Google's search results. The change is to be rolled out over the coming months.
Google itself already operates a site where those affected can request the removal of content concerning them. This covers nude images, sexual material, personal information, content from blackmail websites and, in general, depictions of minors, even when they are not shown in intimate or compromising situations. However, requests filed there affect only Google.
StopNCII (Non-Consensual Intimate Images) collects hashes of intimate photos or videos in which someone claims to be depicted without consent. Deepfakes are treated like real images. StopNCII shares the hashes with several partners, including Meta Platforms, Microsoft, OnlyFans, Pornhub, Reddit, Snap and TikTok, all of which promise to stop showing registered photos or videos. Now Google is joining them.
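The key point of this design is that only fingerprints, never the images themselves, change hands. The article does not specify which hashing algorithm StopNCII uses, so the following Python sketch is purely illustrative: it uses the open-source imagehash library to show how perceptual-hash matching of this kind can work in principle. The file names and the distance threshold are invented for the example.

```python
# Illustrative sketch only: StopNCII's actual hashing scheme is not
# described in the article. Requires: pip install pillow imagehash
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash; the image itself never leaves the device."""
    return imagehash.phash(Image.open(path))

# A platform holding a list of registered hashes can compare uploads
# against it without ever seeing the original image.
registered = {fingerprint("reported_photo.jpg")}  # hypothetical file

def is_registered(upload_path: str, max_distance: int = 5) -> bool:
    """Perceptual hashes tolerate small edits (resizing, recompression),
    so matching uses a Hamming-distance threshold, not exact equality."""
    h = fingerprint(upload_path)
    return any(h - known <= max_distance for known in registered)
```

Because matching works on fingerprints within a distance threshold, partner platforms can recognize lightly edited re-uploads of a registered image without storing or exchanging the image itself.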
Minimum age 18
However, StopNCII does not accept hashes of depictions of minors; a minimum age of 18 applies at the time the image was taken. For AI-generated images, the person shown must appear to be at least 18 years old. Anyone depicted as a minor should instead contact the Take It Down project of the US National Center for Missing & Exploited Children (NCMEC).
As far as can be seen, StopNCII has no safeguards against abuse. Operators of erotic websites could, for example, try to suppress the distribution of a competitor's images. Because hashes are uploaded anonymously and StopNCII never receives the images themselves, the service cannot verify that the person submitting a hash is actually the one depicted and affected.
(ds)