Signal has introduced a new privacy-focused update to its Windows application aimed at limiting system-level screenshot capture.
Highlights
The update brings a “Screen Security” setting that prevents screenshot tools from capturing content within the Signal app—part of a broader response to Microsoft’s AI-powered Recall feature.
Microsoft’s Recall, currently being tested through the Windows Insider Preview program, periodically captures screenshots of user activity to help users search and retrieve past actions via AI.
Although Microsoft has made the feature opt-in and moved to storing its data locally, Recall continues to raise concerns among privacy advocates because of the scope of what it collects, including potentially sensitive information displayed inside other apps.
Signal’s Response
Signal’s new setting is enabled by default on Windows 11 systems and blocks screenshots at the system level. If a user attempts to capture the screen while the feature is active, the image will appear blank—preventing messages and app content from being recorded.
Signal users who prefer to disable the feature can do so manually by navigating to Settings > Privacy > Screen Security, where they’ll be prompted to confirm the change to avoid accidental deactivation.
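As a rough illustration of how a confirmation-guarded toggle like this could be wired up in an Electron app such as Signal Desktop, here is a minimal sketch; the function name and dialog copy are hypothetical, and this is not Signal’s actual implementation:

```typescript
import { BrowserWindow, dialog } from 'electron';

// Hypothetical sketch: require explicit confirmation before the
// Screen Security protection can be switched off.
async function setScreenSecurity(
  win: BrowserWindow,
  enabled: boolean
): Promise<void> {
  if (!enabled) {
    const { response } = await dialog.showMessageBox(win, {
      type: 'warning',
      buttons: ['Cancel', 'Disable'],
      defaultId: 0,
      message: 'Disable Screen Security?',
      detail: 'Screenshot tools will be able to capture this window.',
    });
    if (response !== 1) {
      return; // user cancelled; leave protection enabled
    }
  }
  // Electron's window-level content protection: while enabled,
  // system capture of this window comes back blank.
  win.setContentProtection(enabled);
}
```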
Signal notes that the feature may interfere with screen readers and other assistive technologies; to balance privacy with accessibility, the company lets users who rely on those tools opt out.
Using DRM Flags to Prevent Captures
To implement this protection, Signal has leveraged Digital Rights Management (DRM) flags within its application—similar to how streaming platforms like Netflix prevent screen recording.
These flags prevent Recall and other system-level tools from accessing the visual content of the app window. DRM was never designed for messaging applications, but Signal is repurposing it here to protect users’ conversations.
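Concretely, Electron apps such as Signal Desktop can reach this flag through the framework’s window-level content-protection API, which on Windows calls SetWindowDisplayAffinity with WDA_EXCLUDEFROMCAPTURE. The sketch below shows the general pattern rather than Signal’s source code:

```typescript
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });

  // On Windows this maps to SetWindowDisplayAffinity with
  // WDA_EXCLUDEFROMCAPTURE: the compositor withholds this window's
  // pixels from screenshot and recording tools (Recall included),
  // so captures show a blank region where the window sits.
  win.setContentProtection(true);

  win.loadFile('index.html');
});
```

Because the flag operates on the whole window, it blankets everything Signal renders, which also helps explain why it can interfere with assistive technologies that read the screen, as noted above.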
Developer Concerns Over System-Level Features
Signal has publicly voiced its concerns about the need for workarounds like this. In a recent blog post, the organization emphasized that developers should not have to rely on such techniques to maintain user privacy.
“We hope that the AI teams building systems like Recall will think through these implications more carefully in the future,” the company wrote. “Apps like Signal shouldn’t have to implement a ‘one weird trick’ in order to maintain the privacy and integrity of their services without proper developer tools.”
This criticism reflects a broader industry conversation about how system-level features can impact user privacy—particularly when those features are AI-powered and operate in the background.
Continued Debate Over Data Transparency
Microsoft has revised Recall in response to public feedback, making the feature opt-in, storing its data locally behind user authentication, and letting users pause capture. Even so, concerns about overreach persist: Recall can collect visual data across every application on screen, and it offers neither users nor developers granular control over what gets captured.
Signal’s update is a practical example of how application developers are adapting to this new era of persistent, system-level data capture.
It also underscores the growing tension between AI-enabled operating system features and the privacy expectations of users and developers alike.
As operating systems integrate more AI features into their core functionality, the responsibility of ensuring user privacy is increasingly falling on individual app developers.
Signal’s proactive step may prompt broader discussions on the need for more transparent developer tools and better control mechanisms in future OS updates.