Microsoft Copilot bug lets AI read your private emails
The issue primarily affected business and enterprise users


Feb 19, 2026, 10:39 am

What's the story

Microsoft has confirmed a major privacy flaw in its Microsoft 365 Copilot tool that allowed the AI to access and summarize users' private emails without their consent. The issue primarily affected business and enterprise users of Microsoft 365 Copilot. The glitch, first reported by BleepingComputer, mainly impacted Copilot Chat, an AI assistant integrated into Microsoft 365 apps like Outlook, Word, Excel, and PowerPoint.

Issue details

AI accessed emails in Sent Items and Drafts folders

The bug allowed Copilot to access emails from users' Sent Items and Drafts folders. In some cases, even messages carrying confidentiality labels were accessed by the AI. These labels are meant to block automated access, but in this case they didn't work as intended. Microsoft has confirmed that the problem was detected in January and tracked internally as a service issue.

Fix deployment

A fix for the bug was rolled out in early February

Microsoft has confirmed that it began rolling out a fix for the bug in early February. However, the company hasn't disclosed how many customers were affected or whether any email content was stored beyond generating summaries. The tech giant also didn't clarify whether the problem affected all regions or only specific enterprise environments.


Tool function

Copilot Chat can summarize emails, generate documents

Copilot Chat is designed to help users summarize emails, generate documents, and answer questions using workplace data. It works by accessing files and emails within an organization to provide context-aware responses. This deep integration has raised serious privacy concerns, especially after the recent incident. The bug, which admins could track under the service issue ID CW1226324, allowed draft and sent email messages with a confidentiality label applied to be incorrectly processed by Microsoft 365 Copilot Chat.


Cautionary measures

Incident raises red flags about AI tools accessing sensitive data

The incident comes as organizations and governments grow more cautious about AI tools accessing sensitive communications, with some even blocking built-in AI features on official devices over data security concerns. Microsoft has said the bug is being fixed and that admins can monitor the issue through its service health dashboard. However, a company spokesperson did not respond to a request for comment on how many customers were affected.
