London Met Police live facial recognition system raises privacy, security concerns

News by Chandu Gopalakrishnan

Rights groups raise concerns about the legality of London Met Police’s surveillance software and its impact on privacy

The London Metropolitan Police is to begin using live facial recognition (LFR) technology on a routine basis. Privacy groups have raised concerns about the legality of the surveillance software and its impact on privacy.

“As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. Independent research has shown that the public support us in this regard. Prior to deployment we will be engaging with our partners and communities at a local level,” Assistant Commissioner Nick Ephgrave said in the Met Police statement on the development.

LFR will initially be deployed at locations where serious offenders are frequently spotted, according to the announcement. “Each deployment will have a bespoke ‘watch list’, made up of images of wanted individuals, predominantly those wanted for serious and violent offences.”

“Equally I have to be sure that we have the right safeguards and transparency in place to ensure that we protect people’s privacy and human rights. I believe our careful and considered deployment of live facial recognition strikes that balance,” Assistant Commissioner Ephgrave added in the statement.

However, that assurance appears to have done little to allay privacy concerns.

Civil and human rights organisation Liberty, which represented activist Ed Bridges in his failed attempt to stop South Wales Police (SWP) from using automatic facial recognition (AFR), called the development a “sinister step” that will push the UK into a “surveillance state”.

“This is a dangerous, oppressive and completely unjustified move by the Met. Facial recognition technology gives the State unprecedented power to track and monitor any one of us, destroying our privacy and our free expression,” said Liberty’s advocacy director Clare Collier in a statement.

“Rolling out a mass surveillance tool that has been rejected by democracies and embraced by oppressive regimes is a dangerous and sinister step. It pushes us towards a surveillance state in which our freedom to live our lives free from State interference no longer exists.”

Privacy groups say that the Metropolitan Police has been using live facial recognition in public for years with no public or parliamentary debate.

Ed Bridges last year lost his case against South Wales Police over the use of AFR. His crowdfunded appeal was the world’s first legal challenge over police use of facial recognition technology.

The court accepted the police’s argument that AFR apparatus placed in public is not a form of covert surveillance that would contravene the Regulation of Investigatory Powers Act 2000, which states that "surveillance is covert if, and only if, it is carried out in a manner that is calculated to ensure that persons who are subject to the surveillance are unaware that it is or may be taking place".

The plan was announced at a time when the European Union is considering a possible five-year ban on the use of facial recognition technology while it figures out how to properly regulate it.

It was reported earlier that parliamentary committees have called for a pause in the use of the technology until it is placed on a statutory footing, but both the Metropolitan Police and the Information Commissioner's Office maintain that using the technology is lawful.

VPNMentor last year found that a security tool named BioStar 2, used by the Metropolitan Police among others, exposed biometric data including more than a million fingerprints.

“The concerns over the security of biometrics are not altogether unjustified, with fears being raised over hackers gaining access to sensitive data and regulators admitting they need more time to work out how to prevent the technology being abused,” commented Simon Wood, CEO of Ubisecure.

There are several ways for criminals to circumvent facial recognition surveillance, including the use of deepfakes, said Steve Povolny, head of McAfee Advanced Threat Research.

“Enhanced computers can rapidly process numerous biometrics of a face, and mathematically build or classify human features, among many other applications. While the technical benefits are impressive, underlying flaws inherent in all types of models represent a rapidly growing threat,” he explained.
