More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.
South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Madrid v Juventus game.
But of the 2,470 potential matches with custody pictures, 2,297 - or 92% - were wrong.
Chief Constable Matt Jukes said officers "did not take action" and no one was wrongly arrested.
South Wales Police have made 450 arrests in the last nine months using the automatic facial recognition (AFR) software, which scans faces and compares them with about 500,000 custody images.
The technology has helped the force secure convictions, with one criminal jailed for six years for robbery and another for four-and-a-half years for burglary.
But facial recognition was wrong on 92% of the faces it matched on the day of the Champions League final, with 2,297 incorrect matches, according to data on the force's website.
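As a quick sanity check, the false-match rate follows directly from the figures quoted above (a minimal sketch; the variable names are illustrative, and the totals are those reported by the force):

```python
# Figures reported for the 2017 Champions League final in Cardiff
total_alerts = 2470    # potential matches against custody images
false_matches = 2297   # matches later found to be wrong

true_matches = total_alerts - false_matches
false_rate = false_matches / total_alerts

print(f"True matches: {true_matches}")
print(f"False-match rate: {false_rate:.1%}")  # roughly the 92% figure quoted
```

This also shows that only 173 of the 2,470 alerts that day pointed at genuine matches.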
"We know that our major sporting events and our crowded places are potential terrorist targets, that's a reality," Mr Jukes said.
"So we need to use technology when we've got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that.
"But we don't take the use of it lightly and we are being really serious about making sure it is accurate."
However, all six matches to the police "watch list" at the Liam Gallagher concert in Cardiff in December were valid.
But facial recognition accuracy has improved since the Champions League final, rising to 28% overall.
The force was the first in the UK to make an arrest using the real-time technology last year in the days before the European showpiece.
South Wales Police said the high volume of false matches was down to "poor quality images" supplied by other agencies including UEFA and Interpol - and it being the first major use of the equipment.
Mr Jukes told BBC Wales facial recognition was vital at crowded events to protect people due to the threat of terrorism and had become increasingly accurate due to advancements in technology.
He said officers only took action when identification was confirmed.
"We are using it as a piece of intelligence and we assess that," he told BBC Wales.
"Everyone can take reassurance from the fact that we're certainly not going to simply take the technology and then take very strong action against people."
2,000 positive matches reached with our 'Identify' facial recognition technology in past 9 months with over 450 arrests. https://t.co/iA7dEwRWkb — South Wales Police (@swpolice) May 4, 2018
But civil liberties campaign group Big Brother Watch called for the system to be scrapped, adding it was "outrageous" that more than 2,000 at the event had been wrongly identified.
"Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool," said director Silkie Carlo.
"The tech misidentifies innocent members of the public at a terrifying rate, leading to intrusive police stops and citizens being treated as suspects."