In Brief
Posted:
6:44 AM PST · February 18, 2026
Image Credits: Rafael Henrique/SOPA Images/LightRocket / Getty Images

Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers’ confidential emails for weeks without permission.
The bug, first reported by Bleeping Computer, allowed Copilot Chat to access and summarize the contents of emails since January, even if customers had data loss prevention policies in place to prevent their sensitive information from being ingested into Microsoft’s large language model.
Copilot Chat allows paying Microsoft 365 customers to use the AI-powered chat feature in its Office suite of products, including Word, Excel, and PowerPoint.
Microsoft said the bug, which admins can track as CW1226324, means that draft and sent email messages “with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat.”
The tech giant said it began rolling out a fix for the bug earlier in February. A spokesperson for Microsoft did not respond to a request for comment, including a question about how many customers are affected by the bug.
Earlier this week, the European Parliament’s IT department told lawmakers that it had blocked the built-in AI features on their work-issued devices, citing concerns that the AI tools could upload potentially confidential correspondence to the cloud.