Facial recognition (FR) technology is being used increasingly around the globe. Is this a world of law and order or a privacy dystopia? Or both?
FR has long been a staple of Hollywood and TV: law enforcement agencies use it to great effect against terrorists and other criminals. The reality is more complex. For a start, FR is not as accurate as Hollywood portrays. Then there are privacy concerns. And, of course, there are those who complain that the law doesn’t keep up with technology.
Widespread use in China
China has been using facial recognition for a while. All new mobile phone users must sign up to facial recognition, and live FR is in use at airports and underground train stations. It has been used to identify and shame people wearing pyjamas to the shops. There are also reports that the technology was updated during the coronavirus outbreak earlier in the year to cope with people wearing masks, and to identify when someone had a fever or wasn’t wearing a mask. Live FR has even been used as a form of surveillance of student movements.
…and the USA
OK, it’s probably not unexpected that FR is widespread in China. After all, it’s not really a case of the law not keeping up with technology, since China has a different cultural and legal approach to human rights. So perhaps it comes as more of a surprise to learn that the USA has been using FR extensively too. IBM indicated it would abandon FR technologies completely, and Microsoft and Amazon have indicated they will not sell, or will postpone sales of, FR technologies to police departments. But that seems to have left the field clear for other providers such as Clearview, which has signed up more than 600 law enforcement agencies in the USA alone to use its FR. US senators don’t like this and have decided to look into the issue, and there have been calls for a federal law to govern the use of FR. Oh, and Twitter, Google and Facebook – who have had their own privacy wrangles – sent cease-and-desist letters to Clearview to stop it scraping billions of photos to help those police departments. So, the law doesn’t appear to be keeping up with technology in the USA. But then, there is no general federal law dealing with data privacy.
EDPB says no
So what about where GDPR applies? Well, the European Data Protection Board announced in July 2020 (PDF) that the use of a service such as Clearview AI by law enforcement authorities in the European Union “would, as it stands, likely not be consistent with the EU data protection regime.”
School registration in Sweden
A Swedish school used FR to register 22 students’ attendance as part of a short streamlining trial. The school estimated that rolling the system out school-wide would save 17,280 hours each year. It obtained explicit consent from guardians, who could opt their children out of the trial, and it stored the biometric data locally, without an internet connection. Despite this, in August 2019 the Swedish Data Protection Authority fined the school SEK 200,000 (GBP 17,600, USD 23,000) – quite a high fine in the circumstances. It said the school had breached GDPR in three ways. Firstly, it had processed personal data in a manner disproportionate to the purpose of registering attendance. Secondly, it had processed sensitive (biometric) personal data without a legal basis. Finally, it had not undertaken a data protection impact assessment or consulted the DPA.
Heathrow Airport consults the ICO
What about here in the UK? Well, Heathrow Airport undertook a consultation with the Information Commissioner’s Office about its use of FR, and the ICO published a report about this in July 2020 (PDF). Heathrow is in the process of automating part of the passenger journey through the airport by providing self-service bag-drop units and self-boarding gates. Additionally, Heathrow was progressing an initiative that allows passengers to prove their identity without having to show an identity document. This process would, of course, use FR. The key here was that Heathrow was looking to do this in a compliant manner and was seeking the consent of the passengers concerned. The ICO was largely supportive.
Private use by King’s Cross developer
There are other trials which don’t rely upon consent. For example, the ICO was less enthusiastic about the use of FR at a new development in King’s Cross in London, with the Information Commissioner saying: “I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector.” The developer abandoned the scheme.
South Wales Police
The ICO was also following the case against South Wales Police. SWP started trials of live automatic FR for law enforcement purposes in 2017 at football matches, concerts and other large events. The technology scanned faces in crowds and compared them against a watchlist of suspects. SWP said its use of FR led to the arrest of 61 people for offences including robbery, violence, theft and outstanding court warrants. Liberty brought a claim alongside Mr Bridges, whose face had been scanned while he was Christmas shopping in 2017 and at a peaceful protest in 2018. In August 2020, the UK Court of Appeal partially agreed with Liberty. It ruled that there was no clear guidance on where the technology could be used or who could be put on a watchlist, leaving individual officers too much discretion. SWP’s data protection impact assessment was also deficient, and SWP had not taken reasonable steps to find out whether the technology had a racial or gender bias. Liberty then called for SWP – and the Metropolitan Police, who are also using FR – to stop using the technology. But crucially, the ruling did not prohibit use of the technology. If anything, it provided more clarity on how to use FR lawfully.
What can we learn?
Use of FR – coupled with artificial intelligence – is controversial. And yet its use is growing around the globe and is unlikely to recede. The EDPB and the UK Court of Appeal indicate that broad use of FR without the consent of those being scanned might be unlawful. There have been calls, mirroring those in the US, for a new EU law addressing this.
But the door has been wedged open for the use of FR technologies in the EU. If you set up your use of FR with GDPR in mind and undertake a proper data protection impact assessment to identify whether it is proportionate to the purpose, you might be OK. Ideally, consult the ICO first too, just to be sure.
So, maybe the law can keep up with technology just fine, at least in this instance. And those who say Brexit will see a bonfire of EU regulations in the UK, changing this approach to the use of FR, will be disappointed. Provided the UK government keeps its promises and sticks to its international legal obligations – no longer a given these days – GDPR will continue to apply in the UK.
If you need advice, contact me firstname.lastname@example.org or +44 (0) 20 7611 2338.