ICO calls for statutory code on police use of facial recognition


The UK Information Commissioner's Office demands a new statutory code to oversee police use of 'invasive' facial recognition technology

The UK Information Commissioner's Office (ICO) has demanded a new statutory code to oversee police use of "invasive" facial recognition technology. The watchdog's investigation follows the August revelations about the technology's use at King's Cross station, in which it determined the technology posed a potential threat to privacy.

The report comes after a court case was launched against South Wales Police and 18 politicians signed a petition calling for a halt to the technology's use, with many trials being stopped nationwide.

In its findings, the report said that there are areas of data protection compliance where the Met Police and South Wales Police could improve practices, share lessons and reduce inconsistency.

"There have been missed opportunities to achieve higher standards of compliance and also to improve public awareness and confidence in the technology and its use for law enforcement purposes," the report said.

"The absence of a statutory code of practice and national guidelines contributes to inconsistent practice, increases the risk of compliance failures and undermines confidence in the use of the technology."

Despite over 50 deployments of the technology, in the case of South Wales Police, "there is no clear articulation of what the police consider to be ‘effective’ or at what point the piloting phase may end", it said. 

"This could lead to concerns overall about effectiveness and therefore whether the high number of trials over an extended period supports or undermines the necessity and proportionality case for its use," said the report.

Activist Ed Bridges, who took South Wales Police to court over the use of automatic facial recognition, lost his case in September. His crowdfunded challenge was the world's first legal action over police use of facial recognition technology.

In a blog post, Information Commissioner Elizabeth Denham said that the ICO had "serious concerns about the use of a technology that relies on huge amounts of sensitive personal information".

"We found that the current combination of laws, codes and practices relating to LFR (live facial recognition) will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents," she said.

Denham added that she would ensure that everyone working in this developing area "stops to take a breath and works to satisfy the full rigour of UK data protection law".

"Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology, but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus," she added.

Jason Tooley, chief revenue officer at Veridium, told SC Media UK that forces across the country halting facial recognition trials due to public backlash is a "huge step backwards and puts innovation at risk".

"There is increasing concern in the community that regulators such as the ICO will take too much of a heavy-handed approach to regulating the technology, and we must absolutely ensure innovation is not being stifled or stopped. It’s in the public interest for police forces to have access to innovative technology such as biometrics in order to deliver better services and safeguard our streets," he said.

The report comes days after police in Sweden were granted powers to deploy spyware on suspects' devices to intercept communications and turn on cameras and microphones. The new powers form part of a 34-point programme to strengthen law enforcement, announced by Sweden's Interior Minister Mikael Damberg earlier this week.

Earlier, Sweden's Data Protection Authority (DPA) imposed a fine of £18,000 on the Skelleftea municipality for breaching privacy law after a local school conducted a pilot using facial recognition to track students' attendance.

"The Swedish DPA concluded that the test violates several articles in GDPR and has imposed a fine on the municipality of approximately EUR 20,000 (£18,000). In Sweden public authorities can receive a maximum fine of SEK 10 million (£0.8 million). This is the first fine issued by the Swedish DPA," the announcement said.

Herman DeGroot, regional director for Northern Europe at Securonix, told SC Media UK that the use of interception tools by law enforcement has long been a strongly debated issue and poses many challenges.

"Most notably, ensuring that these tools are only used for legitimate purposes. If interception tools are going to be used by law enforcement then regulations need to be implemented, such as limitations on the types of crimes that can be investigated using these techniques," he said.
