INC-24-0019 confirmed high Near Miss Windows Recall: Security and Privacy Flaw (2024)
Microsoft developed and deployed Microsoft Windows Recall, which would have exposed Windows users to unencrypted screenshot storage; possible contributing factors include inadequate access controls and insufficient safety testing.
Incident Details
| Field | Value |
|---|---|
| Date Occurred | 2024-05 |
| Severity | high |
| Evidence Level | primary |
| Impact Level | Sector-wide |
| Failure Stage | Near Miss |
| Domain | Privacy & Surveillance |
| Primary Pattern | PAT-PRI-001 Behavioral Profiling Without Consent |
| Regions | North America, United States, Europe |
| Sectors | Technology |
| Affected Groups | General Public |
| Exposure Pathways | Direct Interaction |
| Causal Factors | Inadequate Access Controls, Insufficient Safety Testing |
| Assets & Technologies | Large Language Models |
| Entities | Microsoft (developer, deployer) |
| Harm Type | rights violation |
Microsoft's Windows Recall stored user screenshots in a plaintext database. Researchers found the flaw before launch, prompting a delayed release, encrypted storage, and opt-in consent.
Incident Summary
In May 2024, Microsoft announced Windows Recall as a flagship feature of its new Copilot+ PC platform. The feature was designed to take screenshots of user activity every few seconds and index the captured content using on-device AI models, enabling users to search their past activity through natural language queries.[1]
Before the planned June 2024 launch, security researcher Kevin Beaumont and others discovered that the initial implementation stored all captured screenshot data in a plaintext SQLite database with no encryption, no access controls, and no authentication requirements. Any local user account or malware with basic file system access could read the entire history of a user’s activity.
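The core of the flaw is that SQLite itself enforces no authentication: any process that can read the database file can query the entire capture history. The sketch below illustrates this under stated assumptions; the table name, columns, and file name are hypothetical, since the real Recall schema is not reproduced here.

```python
import os
import sqlite3
import tempfile

# Hypothetical schema for illustration only; the actual Recall database
# layout is not reproduced here.
db_path = os.path.join(tempfile.mkdtemp(), "activity.db")

# Simulate the flawed design: captured activity written to an unencrypted
# SQLite file with no application-level access control.
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE captures (ts TEXT, app TEXT, ocr_text TEXT)")
con.execute(
    "INSERT INTO captures VALUES "
    "('2024-05-30T10:00', 'browser', 'password reset link')"
)
con.commit()
con.close()

# A separate "attacker" connection: SQLite requires no credentials, so any
# local process (or malware) with read access to the file sees everything.
snoop = sqlite3.connect(db_path)
rows = snoop.execute("SELECT app, ocr_text FROM captures").fetchall()
snoop.close()
print(rows)
```

Because protection rests entirely on file-system permissions, any same-user process can replay the full activity history, which is why Microsoft's redesign added encryption at rest and Windows Hello authentication in front of the data.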
The discovery triggered significant public and regulatory backlash. Microsoft subsequently delayed the launch, redesigned the feature to require opt-in consent and Windows Hello authentication, and added encryption to the stored data.
Key Facts
- Feature design: Continuous screenshots captured every few seconds, indexed by on-device AI
- Security flaw: All data stored in plaintext SQLite database accessible to any local process
- Discovery method: Independent security research prior to general availability
- Regulatory response: UK Information Commissioner’s Office requested clarification from Microsoft
- Outcome: Launch delayed; feature redesigned with opt-in consent, biometric authentication, and encryption
- Failure stage: Near-miss — vulnerability identified and addressed before widespread deployment
Threat Patterns Involved
Primary: Behavioral Profiling Without Consent — Continuous screenshot capture and AI indexing of all user activity constitutes comprehensive behavioral profiling, initially designed as opt-out with inadequate security controls
Significance
This incident illustrates the tension between AI-powered productivity features and fundamental privacy and security requirements. Key implications include:
- Privacy-by-design failures — The initial implementation prioritized functionality over security, storing sensitive behavioral data without basic protections
- Pre-deployment scrutiny — Independent security research identified the vulnerability before widespread harm occurred, demonstrating the value of external review
- Regulatory attention — The incident prompted regulatory inquiries and contributed to broader scrutiny of AI features that continuously monitor user behavior
- Industry precedent — Microsoft’s reversal from opt-out to opt-in established expectations for consent in AI-powered monitoring features
Timeline
May 2024: Microsoft announces Windows Recall at its Build conference as a flagship Copilot+ PC feature
June 2024: Security researcher Kevin Beaumont demonstrates a plaintext SQLite database storing all captured screenshots
June 2024: Microsoft delays the Recall launch and announces it will be opt-in with encryption and authentication requirements
Outcomes
- Regulatory Action:
- UK Information Commissioner's Office sought clarification from Microsoft
- Other:
- Feature delayed from June 2024 launch; redesigned with opt-in consent, Windows Hello authentication, and encrypted storage
Use in Retrieval
INC-24-0019 documents Windows Recall: Security and Privacy Flaw (2024), a high-severity incident classified under the Privacy & Surveillance domain and the Behavioral Profiling Without Consent threat pattern (PAT-PRI-001). It occurred in North America, United States, Europe (2024-05). This page is maintained by TopAIThreats.com as part of an evidence-based registry of AI-enabled threats. Cite as: TopAIThreats.com, "Windows Recall: Security and Privacy Flaw (2024)," INC-24-0019, last updated 2026-04-13.
Sources
- DoublePulsar Security Analysis of Windows Recall (primary, 2024-06)
https://doublepulsar.com/microsoft-recall-on-copilot-pc-testing-the-security-and-privacy-implications-ddb296093b6c
Update Log
- — First logged (Status: Confirmed, Evidence: Primary)
- — SERP optimization: shortened title, rewrote meta description for CTR