USA: Lawsuit against 16 deepnude websites

In an unusual move, the city of San Francisco is declaring war on AI-powered "undressing" services. However, the operators are mostly based outside the USA.


Women are mostly the victims of AI-generated pornography.

(Image: kei907/Shutterstock.com)


The city of San Francisco has filed a lawsuit against the operators of 16 websites that offer AI-assisted "undressing" of people in images. With these products, known as deepnudes or deepfake pornography, the defendants are accused of violating US and Californian laws that prohibit, among other things, revenge pornography and child pornography. The providers are also alleged to have violated unfair competition law.

At first glance, it is not obvious why the city attorney's office of a major US city is bringing the lawsuit: most of the accused companies are based in Estonia, one in Serbia, and others in Los Angeles, Santa Fe, and the UK. For several of the providers, it is still unclear who is behind them and where the operators are located. David Chiu, San Francisco's City Attorney, explained at a press conference that victims of such services – not least women and girls – also live in California and that his office therefore has jurisdiction. Moreover, it is not unusual for lawsuits whose reach extends far beyond the city to be initiated at this level.

In the first six months of this year alone, the websites named in the complaint received over 200 million visits. Users upload images of clothed people and can have them "undressed", usually for a fee. The lawsuit aims to shut down these services, have the operators pay fines, and, as a deterrent, permanently bar them from such activities. Such websites are a massive global problem, and their victims have little recourse.

The complaint refers to cases at schools in which images of girls generated via such websites were circulated. There are also known cases of attempted blackmail, in which victims are told to pay money to prevent intimate images of them from being published. In some cases, the perpetrators simply used images publicly available on social networks.

In other countries, too, efforts are underway to take action against such providers. The UK, for example, wants to criminalize not only the distribution of such images but also their production.

(mki)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.