Ahead of an upcoming appeal against the High Court ruling permitting police use of facial recognition technology (FRT), the Information Commissioner on 31 October published her first ever official opinion under the Data Protection Act.
Coming from the UK’s respected data supervisor, the document, tackling the use of FRT by law enforcement, will carry considerable weight.
The ICO’s opinion is being published at a time of considerable controversy for this ground-breaking technology. Its use has excited much criticism from privacy campaigners around the world, with the EU reportedly planning new regulation from next year and several US cities banning it outright.
UK law enforcement has been cautious, trialling the technology in various locations. In September, following a legal challenge to one such trial by South Wales Police, the High Court delivered its judgment on FRT, dismissing claims brought by Ed Bridges and Liberty that it breached the European Convention on Human Rights, data protection legislation and equalities laws.
In the aftermath of the High Court’s decision, the Home Office was quick to extol the crime-fighting virtues of the technology and its ability to ease pressure on resources to protect the community. With the release of her official opinion, however, the ICO has sought to rein in any enthusiasm the Bridges judgment may have engendered amongst law enforcement agencies that saw the decision as a green light for the roll-out of FRT, warning that headlong deployment of overly invasive technologies may damage trust in the UK’s model of policing by consent.
Emphasising the fact-specific nature of the Bridges ruling, the ICO’s opinion highlights the data protection obligations on the police when deploying FRT. To comply with data legislation, the technology’s use must be “strictly necessary”, a high bar the ICO says is more likely to be achieved where it is deployed in a targeted, intelligence-led and time-limited fashion in the pursuit of serious criminals rather than petty thieves.
Echoing a recent online blog about FRT by the influential European Data Protection Supervisor, the ICO requires law enforcement to show, each time it deploys FRT, why less intrusive alternatives could not be used instead. Effectiveness – judged by reference to “demonstrable” public benefit, such as arresting and convicting serious criminals – will apparently be a key justification for FRT use.
Without the gift of foresight, though, it is hard to see how law enforcement would ever be able to prove such public benefit until the FRT is up and running.
In a riposte to the High Court’s view in the Bridges judgment, the ICO asserts that “watchlists” – the databases of wanted individuals used by the FRT algorithms – are unlikely to meet the requisite legal thresholds where they comprise individuals suspected of “non-serious” offences, custody images unlawfully retained by the police, or photographs from social media. Indeed, with regard to internet-harvested images, at a time when “deep fakery” is becoming increasingly sophisticated, the regulator’s anxiety to ensure that watchlists are of the highest quality and unquestionable provenance is understandable.
A companion report, also issued by the ICO last Thursday, warns that the absence of national guidelines gives rise to inconsistent approaches between police forces, risking confusion, public concern and error. To avoid this, the data regulator urges the Government to bring forward, as soon as possible, statute-based and binding rules building on the advisory principles in the Surveillance Camera Code.
The ICO stance mirrors that of the House of Commons Science and Technology Committee which earlier this year expressed concern about the deployment of such ground-breaking technology in the absence of an adequate legal framework. The Select Committee went further, though, at the time calling for a moratorium on the deployment of FRT (a call the ICO would doubtless have found difficult to repeat faced with the carefully reasoned High Court decision in Bridges, albeit one that is subject to appeal).
While the ICO’s opinion is not directly relevant to private sector use of FRT, the regulator has confirmed it is investigating this too and foreshadows a further official opinion in due course. Given the concerns expressed by the data watchdog so far, particularly with regard to necessity and watchlist composition, it is arguably already possible to discern the direction of regulatory “travel” where private FRT use is concerned.
Early adopters of this game-changing technology may be disappointed at the ICO’s apparently lukewarm attitude. However, the regulator is expressly not seeking to curtail its use completely; rather by foreshadowing statutory regulation, it is advocating a more cautious approach aimed at maintaining confidence in high-tech policing in coming years.
In due course, such a gradualist path is likely to achieve longer-term “buy-in” from legislators and the wider public, ensuring the crime-fighting benefits of FRT are counterbalanced with due regard for concerns about its impact on privacy and the current limitations of the algorithms it employs.
Notwithstanding electoral pledges to increase the number of police officers, the ever-increasing pressure on law enforcement resources and the growing importance of the private sector in fighting crime surely mean that long-term solutions must accept a role for private bodies in the deployment and use of this technology; an entirely separate solution for the police alone is unlikely to serve the wider public need.