Microsoft 365 Bug Exposed Customer Emails to Copilot AI, Raising Privacy Concerns

This article was generated by AI and cites original sources.

Microsoft has acknowledged a critical bug that allowed its Copilot AI to access and summarize confidential customer emails without authorization. The bug, first reported by Bleeping Computer, had enabled Copilot Chat to read and summarize email contents since January, bypassing the data loss prevention (DLP) policies designed to keep sensitive information out of Microsoft's large language models.

Copilot Chat, an AI-powered chat feature available to Microsoft 365 subscribers in Office products such as Word, Excel, and PowerPoint, was affected by the vulnerability, tracked internally as CW1226324. As a result, draft and sent emails labeled 'confidential' were incorrectly processed by Copilot Chat.

Microsoft began deploying a fix for the bug earlier this month. However, the company has not disclosed the extent of the impact and declined to comment on how many customers were affected. The incident raises concerns about data privacy and security within Microsoft's ecosystem, underscoring the need for stringent safeguards against unauthorized access to sensitive information.

Source: TechCrunch