Football fans pass through security gates before the match at the National Stadium in Cardiff. Photograph: Martin Rickett/PA

Welsh police wrongly identify thousands as potential criminals


South Wales force defends use of facial recognition technology at 2017 Champions League final

A police force has defended its use of facial recognition technology after it was revealed that more than 2,000 people in Cardiff during the 2017 Champions League final were wrongly identified as potential criminals.

South Wales police began trialling the technology in June last year in an attempt to catch more criminals. The cameras scan faces in a crowd and compare them against a database of custody images.
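In broad terms, systems of this kind convert each detected face into a numerical "embedding" and compare it against embeddings of the custody images, raising an alert when the similarity passes a threshold. The sketch below is a minimal illustrative example of that comparison step in Python, not the system South Wales police uses; the embeddings, identifiers and threshold are invented for the purpose of the example.

```python
import numpy as np
from typing import Dict, Optional, Tuple

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face: np.ndarray,
                            watchlist: Dict[str, np.ndarray],
                            threshold: float = 0.6) -> Optional[Tuple[str, float]]:
    """Return the best watchlist match for a detected face, or None.

    A score above the threshold would raise an alert for an officer to
    review. Poorer-quality custody images produce less reliable
    embeddings, which is one source of false positives.
    """
    best_id, best_score = None, threshold
    for person_id, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_id else None

# Illustrative only: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {f"custody_{i}": rng.normal(size=128) for i in range(5)}
detected_face = rng.normal(size=128)
print(match_against_watchlist(detected_face, watchlist))
```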

As 170,000 people arrived in the Welsh capital for the football match between Real Madrid and Juventus, 2,470 potential matches were identified.

However, according to data on the force’s website, roughly 93% (2,297) of those were found to be “false positives”.
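The false positive rate is simply the share of alerts that turned out to be wrong. A quick check of the published figures, assuming 2,470 alerts of which 2,297 were incorrect:

```python
total_alerts = 2470      # potential matches flagged at the final
false_positives = 2297   # matches later found to be incorrect

false_positive_rate = false_positives / total_alerts
true_matches = total_alerts - false_positives

print(f"False positive rate: {false_positive_rate:.1%}")  # ~93.0%
print(f"Correct matches: {true_matches}")                 # 173
```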

South Wales police admitted that “no facial recognition system is 100% accurate”, but said the technology had led to more than 450 arrests since its introduction. It also said no one had been arrested after an incorrect match.

A spokesman for the force said: “Over 2,000 positive matches have been made using our ‘identify’ facial recognition technology, with over 450 arrests.

“Successful convictions so far include six years in prison for robbery and four and a half years’ imprisonment for burglary. The technology has also helped identify vulnerable people in times of crisis.

“Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops. Since initial deployments during the European Champions League final in June 2017, the accuracy of the system used by South Wales police has continued to improve.”

The force blamed the high number of false positives at the football final on “poor quality images” supplied by agencies, including Uefa and Interpol, as well as the fact it was its first major deployment of the technology.

Figures also revealed that 46 people were wrongly identified at an Anthony Joshua fight, while there were 42 false positives from a rugby match between Wales and Australia in November.

All six matches made at a Liam Gallagher concert in Cardiff in December were correct.

The chief constable, Matt Jukes, said the technology was used where there were likely to be large gatherings, because they were “potential terrorist targets”.

“We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that,” he told the BBC. “But we don’t take the use of it lightly and we are being really serious about making sure it is accurate.”

The force said it had considered privacy issues “from the outset”, and had built in checks to ensure its approach was justified and proportionate.

However, the civil liberties campaign group Big Brother Watch criticised the technology.

In a post on Twitter, the group said: “Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool.”
