The company says it has addressed the issue, stating that the bug "did not provide anyone access to information they weren't already authorised to see", and that it is deploying a fix for affected users.
Microsoft confirmed that a bug in Microsoft 365 Copilot caused the AI chatbot to read and summarize paying customers' confidential emails without proper permission, bypassing data protection policies. Copilot Chat reportedly surfaced confidential emails from users' Drafts and Sent folders, and Microsoft says the behaviour had been occurring since late ...
The flaw meant Microsoft's Copilot AI assistant could read and summarize messages carrying tags specifically designed to prevent it from doing so. Microsoft has since deployed a fix for the bug, which highlights the hazards of using AI in the workplace.
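The tags in question are sensitivity labels attached to messages, which are supposed to keep labeled content out of AI processing. The guardrail the bug bypassed can be sketched as a simple filter that excludes labeled messages before anything reaches a summarizer; the header name, label values, and message structure below are illustrative assumptions, not Microsoft's actual implementation.

```python
# Illustrative sketch of a sensitivity-label guardrail for an email summarizer.
# The "Sensitivity-Label" header name and the label values are hypothetical;
# Microsoft's real enforcement mechanism is not public.

CONFIDENTIAL_LABELS = {"Confidential", "Highly Confidential"}

def is_restricted(message: dict) -> bool:
    """Return True if the message carries a sensitivity tag that should
    keep it out of AI processing."""
    label = message.get("headers", {}).get("Sensitivity-Label", "")
    return label in CONFIDENTIAL_LABELS

def summarize_allowed(messages: list[dict]) -> list[dict]:
    """Filter out restricted messages before handing anything to a model."""
    return [m for m in messages if not is_restricted(m)]

inbox = [
    {"subject": "Lunch plans", "headers": {}},
    {"subject": "Q3 layoffs", "headers": {"Sensitivity-Label": "Confidential"}},
]
print([m["subject"] for m in summarize_allowed(inbox)])  # ['Lunch plans']
```

The reported bug amounts to this filter being skipped, so labeled messages flowed through to summarization regardless of their tags.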