Messenger Signal criticizes Microsoft's Recall

Windows is set to gain a screenshot function that makes on-screen content searchable for the AI. Signal does not want Microsoft reading along.

Signal on the screen (Image: Signal)


Signal does not want Microsoft's Recall function to capture screenshots of its messenger's conversations, and criticizes that it has to resort to a trick to prevent this. According to Signal, Microsoft should give developers proper options to exempt their apps from the automated screenshots. Instead, Signal is introducing "Screen security": screenshots of the app simply remain blank.

Recall exists so that Copilot, Microsoft's AI assistant, can answer users' questions and act as a helpful assistant that also knows what is happening on the screen. To that end, Windows automatically takes screenshots, which the AI model can evaluate and process in the background. Recall's first planned launch was postponed after protests over data protection concerns, but it is now set to be integrated into Windows soon.


Anyone using Signal on their desktop would then have to expect private messages to end up in the screenshots Recall analyzes. The messenger, however, prevents this from the outset: the protection is enabled by default as of Signal version 7.55. According to Signal, the digital rights management (DRM) mechanism that makes this possible is actually intended to shield copyrighted content from screen capture, for example so that series cannot be recorded from streaming services.
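Signal's post does not spell out the exact Windows API, but the capture protection it describes corresponds to the display affinity flag that Windows exposes for exactly this kind of DRM-style use. A minimal sketch, assuming a plain Win32 window handle and an illustrative helper name, might look like this:

```c
#include <windows.h>

// Sketch: ask the compositor to leave this window out of screen capture.
// WDA_EXCLUDEFROMCAPTURE (Windows 10 version 2004 and later) blanks the
// window in screenshots and recordings; WDA_MONITOR is the older, more
// limited flag. The helper name is illustrative, not taken from Signal's code.
BOOL exclude_window_from_capture(HWND hwnd)
{
    if (SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE)) {
        return TRUE;
    }
    // Fall back for older Windows builds that only support WDA_MONITOR.
    return SetWindowDisplayAffinity(hwnd, WDA_MONITOR);
}
```

Switching the protection off again, as Signal's settings allow, would amount to resetting the affinity with WDA_NONE.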

However, the feature also blocks screenshots for other purposes, for example tools that serve as visual or reading aids for people with disabilities. Anyone who wants to take screenshots anyway has to deactivate the protection in the settings.

Signal also points out that Screen security only works on Windows and that a chat partner can still take screenshots on their own device. The feature is not available to macOS and Linux users in the first place.

That Microsoft offers developers no explicit option to exclude their app from the screenshots for privacy reasons, forcing Signal to resort to a trick instead, is a "glaring omission that limits our options", writes Joshua Lund of Signal in the blog post. "Operating system providers, especially those that deploy AI agents, must ensure that app developers like Signal always have the necessary tools and options to deny AI systems access to sensitive information in their apps at the operating system level."

Signal President Meredith Whittaker also said at a recent conference: "There is a profound security and privacy issue with this hype around AI agents, which ultimately threatens to break the 'blood-brain barrier' between the application layer and the OS layer by merging all these separate services and data."

(emw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.