UK police blame Microsoft Copilot for intelligence mistake


The chief constable of one of Britain’s largest police forces has admitted that Microsoft’s Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a nonexistent match between West Ham and Maccabi Tel Aviv.

Copilot hallucinated the game, and West Midlands Police included the error in its intelligence report. “On Friday afternoon I became aware that the inaccurate result relating to the West Ham v Maccabi Tel Aviv match arose as a result of a use of Microsoft Co Pilot [sic],” says Craig Guildford, chief constable of West Midlands Police, in a letter to the Home Affairs Committee earlier this week. Guildford had previously denied in December that West Midlands Police used AI to prepare the report, blaming “social media scraping” for the error.

Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa in November last year, after the Birmingham Safety Advisory Group deemed the match “high risk” following “violent clashes and hate crime offences” at a previous Maccabi match in Amsterdam.

As Microsoft warns at the bottom of its Copilot interface, “Copilot can make mistakes.” This is a fairly high-profile mistake, though. We tested Copilot recently, and my colleague Antonio G. Di Benedetto found that Microsoft’s AI assistant often “got things wrong” and “made stuff up.”

We’ve reached out to Microsoft to comment on why Copilot made up a football match that never existed, but the company didn’t respond in time for publication.