UK police blame Microsoft Copilot for intelligence mistake
""On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic]," says Craig Guildford, chief constable of West Midlands Police, in a letter to the Home Affairs Committee earlier this week. Guildford previously denied in December that the West Midlands Police had used AI to prepare the report, blaming "social media scraping" for the error."
"Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa in November last year, because the Birmingham Safety Advisory Group deemed the match "high risk" after "violent clashes and hate crime offences" at a previous Maccabi match in Amsterdam. As Microsoft warns at the bottom of its Copilot interface, "Copilot may make mistakes." This is a pretty high profile mistake, though."
The chief constable of West Midlands Police admitted that Microsoft's Copilot fabricated a West Ham v Maccabi Tel Aviv match that never took place, and that the hallucination was included in an intelligence report. West Midlands Police had previously denied using AI, blaming social media scraping for the erroneous result. The fabricated match contributed to Maccabi Tel Aviv fans being banned from a Europa League match after the Birmingham Safety Advisory Group deemed the fixture high risk following violent clashes and hate crime offences at a previous match in Amsterdam. Microsoft warns that Copilot may make mistakes, and independent testing has found the assistant often produces incorrect or fabricated information. Microsoft did not respond to requests for comment.
Read at The Verge