Study: Twitter deletes deepfakes faster when reported as copyright infringements

According to a study, Twitter is apparently quicker to act on copyright infringement reports than on reports of non-consensually posted nude images or deepfakes.


According to a study conducted at the University of Michigan, Twitter deletes deepfakes significantly faster if they are reported as copyright infringements.

(Image: Skorzewiak/shutterstock.com)


X is more likely to remove deepfakes if they are reported on the grounds of copyright infringement. That is the finding of researchers at the University of Michigan. According to the study, X, formerly Twitter, left content reported as non-consensual intimate imagery on the platform for weeks, or did not delete it at all.

As part of the study, the researchers posted 50 AI-generated nude photos on Twitter. They reported half of them as non-consensual depictions of nudity and the other half as copyright infringements. Twitter deleted all 25 of the images reported as copyright infringements from the platform within 25 hours. The accounts used to post the images were temporarily suspended.

The images that the researchers had reported as non-consensual depictions of nudity were still online three weeks after the report. The accounts that had posted them were neither blocked nor notified or warned in any way.


The likely reason: in cases of copyright infringement in the USA, the Digital Millennium Copyright Act (DMCA) applies. The US law, passed in 1998, was intended to create a legal framework for copyright protection in the digital space. It requires platforms to respond quickly to reports of copyright infringement and to delete the content in question after review. This apparently gives Twitter enough of an incentive to deal with such reports promptly.

Individual US states also have laws against the dissemination of non-consensual depictions of nudity, and there are currently efforts to introduce a federal law. At the moment, however, Twitter and other platforms appear to lack any incentive to act on reports of non-consensual nudity and deepfakes as quickly as they act on copyright complaints.

However, according to the authors of the paper, not every victim can have non-consensually published photos deleted via a DMCA notice: the copyright to a photo is held by the person who took it. If the photo was taken by someone else, the victim cannot invoke the DMCA. In addition, according to 404 Media, such a notice apparently requires relatively extensive information. There are services that can be commissioned to file such notices, but not every victim can afford the associated costs.

The authors of the study conclude that a law against the non-consensual dissemination of intimate personal content must oblige internet platforms to respond to such reports just as quickly.

They cite the GDPR as a positive example: the General Data Protection Regulation has shaken up the way platforms have handled user data and content to date. The data protection requirements formulated therein are important steps in the right direction. According to the authors, the protection of intimate personal portrayals requires a similarly binding legal framework. A peer review of the study is still pending.

In the EU, non-consensual publications of nude images and deepfakes fall under the Digital Services Act, or DSA for short. The DSA has been fully in force since February 2024 and was implemented in Germany by the Digitale-Dienste-Gesetz. The law requires platforms to moderate such content in a timely manner.

The short messaging service run by Elon Musk had already run up against the law at the end of 2023. At that time, the EU Commission initiated formal proceedings against the service to investigate whether X may have breached the DSA in the areas of risk management, content moderation, and dark patterns, among others.

(kst)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.