
"Microsoft has confirmed that a bug allowed its Copilot AI to summarize customers' confidential emails for weeks without permission. The bug, first reported by Bleeping Computer, allowed Copilot Chat to read and outline the contents of emails since January, even if customers had data loss prevention policies to prevent ingesting their sensitive information into Microsoft's large language model. Copilot Chat allows paying Microsoft 365 customers to use the AI-powered chat feature in its Office software products, including Word, Excel, and PowerPoint."
"Microsoft said the bug, trackable by admins as CW1226324, means that draft and sent email messages "with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat." The tech giant said it began rolling out a fix for the bug earlier in February. A spokesperson for Microsoft did not respond to a request for comment, including a question about how many customers are affected by the bug."
Microsoft's Copilot Chat processed and summarized confidential draft and sent emails without permission, bypassing customers' data loss prevention policies. The bug had allowed Copilot Chat to read and outline email contents since January for paying Microsoft 365 customers who use the feature in Office apps such as Word, Excel, and PowerPoint. The issue, trackable by admins as CW1226324, caused messages labeled confidential to be incorrectly processed by Microsoft 365 Copilot Chat. Microsoft began rolling out a fix earlier in February but did not say how many customers were affected. Separately, the European Parliament has blocked built-in AI features over cloud confidentiality concerns.
Read at TechCrunch