Police still using AI tool despite inaccurate evidence in Israeli football fan ban | UK News


At least 21 police forces across England are still using Copilot AI despite West Midlands Police (WMP) blocking Microsoft’s tool after inaccurate evidence informed a decision to ban Israeli football fans, Sky News can reveal.

The Birmingham force turned off access to the software after admitting, following initial denials, that a Copilot “hallucination” was responsible for a match that never happened being included in an intelligence document justifying the exclusion of Maccabi Tel Aviv fans from an Aston Villa match in November.

And, at the weekend, MPs on the Home Affairs Select Committee highlighted fresh concerns about Copilot, saying it had produced inaccurate claims about past disorder around a contentious Maccabi match in Amsterdam in 2024.


Microsoft told Sky News it “continuously evaluates” Copilot and urges companies to review how they are using it.

Only eight of the UK forces that responded to our questions on their AI policy – including police in Scotland and Northern Ireland – told us Copilot could not be used in investigations.

Our discovery that so many forces still allow officers to use Copilot points to a disjointed approach across the country and a lack of coordination between forces.

That is despite the Maccabi ban escalating into one of the biggest policing controversies last year, eventually leading to the WMP’s chief constable, Craig Guildford, being forced out under government pressure.


The National Police Chiefs’ Council told Sky News it “is confident that the potential benefits of using AI outweigh the risks posed, provided we remain committed and vigilant in using it correctly, responsibly and securely”.

Its AI experts advise forces to use Copilot “in the most appropriate way” – leaving decisions to be taken locally.

Greater Manchester Police, which is England’s second-largest police force, defended the use of AI, telling Sky News: “We have a robust AI policy in place to help promote the use of such technology to speed up processes and ensure officers have more time to be on the streets rather than behind their desks.”

West Yorkshire Police said staff are provided with “education and guidance on how to use it responsibly, which should avoid any issues”.

But it has taken the Maccabi controversy to highlight concerns about how AI is being used and whether the technology was tested robustly enough before being approved.

Image: A banner held by pro-Israel supporters outside Villa Park. Pic: PA

‘Significant shortcomings’

West Midlands Police and Crime Commissioner Simon Foster told Sky News: “I am concerned about the way in which WMP was utilising AI, not only in connection with this particular policing operation.

“Because plainly there were some significant concerns, shortcomings, and failures around ensuring there was a proper regulatory management of the use of AI in connection to this particular police operation.”

It emerged that senior West Midlands officers were not clear about how the AI generated erroneous evidence – highlighting wider concerns about the technology being used as a time-saving tool despite the risks.

The Home Affairs Select Committee found “proper due diligence was not applied”.

Mr Foster said: “We need to make sure that it is lawful, it is reasonable, it is ethically used and there’s a proper regulatory regime in place to ensure that it’s not misused, and it doesn’t throw up rogue results.”


The police forces covering Northern Ireland and Scotland do not allow Copilot, while there are also blocks in place for North Wales and Dyfed-Powys forces.

But Chris Todd, chair of the National Police Data and Analytics Board and Humberside Police chief constable, insisted AI is “providing benefits to our communities” by joining up data and reducing delays in stopping criminals.

He said: “In Humberside Police we comply with the position of the National Police Chiefs’ Council Artificial Intelligence Portfolio, which has outlined their confidence in that the potential benefits of using AI outweigh the risks posed, provided we remain committed and vigilant in using it correctly, responsibly and securely.

“We echo that it should be used to support human decisions, not make them for us.”

Among those with a more cautious approach is the Cleveland force, which does not block Copilot but insists “the force doesn’t use AI to form intelligence or to assist with investigations”.

Police Scotland has been running a trial with Copilot since October involving a “limited number of police officers and staff” as it balances “ethical and human rights considerations” with duties to keep people safe.

The force said: “The trial does not involve any operational policing processes, and instead focuses on efficiencies in corporate processes, such as improving the retrieval of information across existing HR policies.”

Microsoft defended its software and pointed to differences between the 365 Copilot service for workplaces and the free Copilot consumer chat service for general use online.

Some police forces using Copilot admitted their officers use the consumer chat product.

A Microsoft spokesperson said in a statement: “Microsoft 365 Copilot is grounded in an organisation’s own data, security, and access controls, works only with information a user already has permission to access, and provides citations, so sources can be reviewed and verified.

“We continuously evaluate and improve our services and encourage organisations to use Copilot within their own governance and review practices.”
