Deployment of facial recognition has received another “endorsement” in the UK – this time from London’s Metropolitan Police Director of Intelligence Lindsey Chiswick, speaking at an event co-organized by the Tony Blair Institute for Global Change.

The Future of Britain Conference 2024 is co-hosted by My Life My Say, a charity with links to UK authorities, and by the US embassy in London.

Despite civil rights groups like Big Brother Watch consistently warning against turning the UK’s high streets into a facial recognition-powered privacy and security nightmare, Chiswick was upbeat about using this technology.

She shared that the results so far have been “really good,” and asserted that this is happening as the Met are “conscious” of privacy concerns, which is far from any pledge that those concerns are being properly addressed – the police are simply aware of them.

Perhaps in line with that attitude, she conveniently left out the fact that the system is essentially a dragnet, scanning the faces of hundreds of thousands of law-abiding citizens in search of a very small number of criminal offenders – sometimes just to make a single arrest.

But while Chiswick directs citizens to the Met website where they can see “transparency” in action – explanations of the legal mandate, and “all sorts of stuff” – she insists that this transparency is much better than what private companies that use the same tech offer.

The idea seems to be to reassure the public not by saying, “we respect your privacy and rights, and are open about how” – but rather, “we’re less bad than the other guys.”

According to Chiswick, facial recognition opens up a number of “opportunities” (in the mass surveillance effort) – such as crime pattern recognition, traffic management, forensic analysis, and body-worn video analysis.

This high-ranking Met official came across as a major proponent and/or apologist of the controversial tech, describing it as a “game changer” that has already made a “huge difference” in how London is policed.

Chiswick goes into the various techniques used to try to match images (taken by surveillance equipment, and from other sources) – one of them, live facial recognition, being the most contentious.

She promises that the “bespoke watch list” against which live camera feed images are compared is “not massive.”

“That’s being created over time. So it’s bespoke to the intelligence case that sits behind the deployment,” Chiswick said. “If an offender walks past the camera and there’s an alert, that’s a potential match.”