Whose face is it?

Debate on the moral and legal aspects of facial recognition surveillance has been raging around the world. Where does the affected individual stand?

Early in August, California-based air travel claims company ClaimCompass received an unusual request. 

A passenger filed a claim under the EU 261 regulation for a cancelled Lufthansa flight from Frankfurt to Sao Paulo. What was unusual about the claim was a demand made by the airline: send a selfie of the passenger holding their passport.

"We received a claim for compensation under EC261/2004 Regulation for two passengers who were scheduled to fly on LH506 from Frankfurt to Sao Paulo on 18 July. The request to send a selfie of the passenger holding their passport was sent directly to us on 2 August," wrote Ivo Atanassov, who was then a partner relations manager at ClaimCompass, EU, in an email to SC Media UK. 

Suspecting a GDPR violation, ClaimCompass forwarded the request to its legal team. The suspicion was confirmed.

The airline was violating specific GDPR provisions by making the request. Under Article 4(14) of the GDPR, facial images are considered "biometric data" and are subject to additional requirements: under Article 9(1), processing of biometric data is prima facie prohibited unless the processing in question falls within one of the exceptions in Article 9(2).

Whose face is it?

Regulators are tightening the noose on the use of facial recognition data by both public and private organisations. The latest in the series of regulatory interventions came in Sweden, where the country’s Data Protection Authority (DPA) imposed a fine of £18,000 on the Skelleftea municipality for breaching privacy law.

"A school in northern Sweden has conducted a pilot using facial recognition to keep track of students' attendance in school. The test run was conducted in one school class for a limited period of time," said the Swedish DPA announcement.

"The Swedish DPA concluded that the test violates several articles in GDPR and has imposed a fine on the municipality of approximately EUR 20,000 (£18,000). In Sweden public authorities can receive a maximum fine of SEK 10 million (£0.8 million). This is the first fine issued by the Swedish DPA," it added.

"As the enforcement action in Sweden shows, facial recognition technology involves the use of biometric information for the purposes of uniquely identifying an individual. That means it is special category personal data and subject to stricter rules under data protection law," said Martin Sloan, partner at Brodies Solicitors.

The Swedish DPA observed that the school had "processed sensitive biometric data unlawfully", without a proper impact assessment or prior consultation with the regulator.

Is consent really free?

GDPR mandates that organisations request consent from every person scanned and prove that those individuals were fully informed and were neither coerced nor lured into giving it.

Article 9(2)(a) of the regulation states that collection and processing of biometric data, including facial recognition, is valid when "the data subject has given explicit consent to the processing of personal data".

According to the ClaimCompass legal team, this exception does not hold water in the Lufthansa situation. Article 7(4) states that utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.

"In other words, requesting biometric data is prohibited unless specific consent is granted because that data is absolutely necessary for the performance of the contract," wrote Ivo.

The Swedish school had based its processing on consent. However, the Swedish DPA deemed that this was not a valid legal basis, as there was a "clear imbalance between the data subject and the controller".

It is unlikely that organisations will be able to rely upon consent, observed Sloan.

"Consent cannot be implied from simply entering a public place or a building, or relied upon where there is an imbalance of power (for example, students in a school or employees in their place of work). Organisations need to identify another legal basis for the use of facial recognition technology," the solicitor said.

"Organisations also need to take into account the principles in relation to proportionality. Again, this presents challenges when using facial recognition technology. In the Swedish case, the regulator concluded that it was not necessary to use facial recognition technology for the purposes of monitoring attendance."

Rights and responsibilities

Any organisation collecting biometric data, including facial recognition details, has an obligation to demonstrate compliance with data protection law, said Sloan.

"The Swedish case also highlights the importance of carrying out a data protection impact assessment to ensure that the deployment of facial recognition technology is lawful and that the privacy risks are identified and mitigated. If there is still a high risk to individuals, as there was in the Swedish school pilot, then the organisation should consult with the Information Commissioner’s Office (ICO)," he said.

The ICO last month opened an enquiry into the facial recognition system installed by real estate developer Argent LLP in London’s King's Cross area. Argent cited public safety as the reason for setting up the system. However, its actions still fall within the ambit of the GDPR, as only law enforcement agencies are exempt, said Sloan.

"The use of facial recognition technology by law enforcement agencies is not subject to GDPR. Instead it is regulated by the Data Protection in Law Enforcement Directive, which was implemented in the UK through the Data Protection Act 2018," he said. 

That law is based on the general principles of the GDPR on the processing of personal data, and law enforcement agencies have an obligation to provide individuals with certain information.

"Provided the processing is lawful, individuals have limited rights in relation to processing by law enforcement agencies. For example, the right to be forgotten does not apply. If an individual feels that the use of facial recognition technology is unfair, they can complain to that organisation or to the Information Commissioner’s Office," said Sloan.

Meanwhile, activist Ed Bridges, who took South Wales Police to court over its use of automatic facial recognition (AFR), has lost his case. His crowdfunded case was the world’s first legal challenge over police use of facial recognition technology.

In the Lufthansa case, ClaimCompass's legal team says customers have the right to refuse the company's demand for a selfie.

"Our legal department sent a statement (to Lufthansa) advising that the requested documentation is not required in order to process the claim as defined in EC261 and that such requests violate the GDPR. We have not received a response yet," wrote Ivo in the email.
