High Court declares facial recognition legal...

Police use of facial recognition is legal, Cardiff high court rules

Ruling comes as London mayor acknowledges Met police role in its deployment in King’s Cross development

From The Guardian | Owen Bowcott | First published on Wed 4 Sep 2019 11.18 BST

In his article, Owen Bowcott writes…

“Police use of automatic facial recognition technology to search for people in crowds is lawful, the high court in Cardiff has ruled.

Although the mass surveillance system interferes with the privacy rights of those scanned by security cameras, two judges have concluded that it is not illegal.

The legal decision came on the same day the mayor of London, Sadiq Khan, acknowledged that the Metropolitan police had participated in the deployment of facial recognition software at the King’s Cross development in central London between 2016 and 2018, sharing some images with the property company running the scheme.

That contradicted previous assurances about the relationship with King’s Cross given by the mayor, who asked the Met “as a matter of urgency” to explain what images of people had been shared with the developer and other companies.

Last month, the property company behind King’s Cross became one of the first to say it had used facial recognition software in two street cameras until 2018 for reasons of “public safety”, but following an outcry it said it had abandoned plans to deploy the controversial technology more widely on the site.

The legal challenge in Cardiff was brought by Ed Bridges, a former Liberal Democrat councillor from the city, who noticed the cameras when he went out to buy a lunchtime sandwich. He was supported by the human rights organisation Liberty. He plans to appeal against the judgment.

Bridges said he was distressed by police use of the technology, which he believes captured his image while he was out shopping and later at a peaceful protest against the arms trade. During the three-day hearing in May, his lawyers alleged the surveillance operation breached data protection and equality laws.

The judges found that although automated facial recognition (AFR) amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.

Dismissing the challenge, Lord Justice Haddon-Cave, sitting with Mr Justice Swift, said: “We are satisfied both that the current legal regime is adequate to ensure appropriate and non-arbitrary use of AFR Locate, and that South Wales police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act and the data protection legislation.”

Responding to the judgment, Megan Goulding, a Liberty lawyer, said: “This disappointing judgment does not reflect the very serious threat that facial recognition poses to our rights and freedoms … It is time that the government recognised the danger this dystopian technology presents to our democratic values and banned its use.”

Bridges said: “South Wales police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent. This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”

Facial recognition technology maps faces in a crowd and compares them to a watch list of images, which can include suspects, missing people and persons of interest to the police.

The cameras scan faces in large crowds in public places such as streets, shopping centres, football matches and music events such as the Notting Hill carnival.

Three UK forces have used facial recognition in public spaces since June 2015: the Met, Leicestershire and South Wales police.

Lawyers for South Wales police told the hearing facial recognition cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.

The technology was likened to police use of DNA. Those not on a watch list would not have their data stored after being scanned by AFR cameras, the court was told.

The chief constable of South Wales police, Matt Jukes, said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern. So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.

“There is, and should be, a political and public debate about wider questions of privacy and security. It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”

A spokeswoman for the Information Commissioner’s Office, which intervened in the case, said: “We welcome the court’s finding that the police use of live facial recognition systems involves the processing of sensitive personal data of members of the public, requiring compliance with the Data Protection Act 2018.

“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”

A survey of more than 4,000 adults released on Wednesday by the Ada Lovelace Institute found that a majority (55%) want the government to impose restrictions on police use of facial recognition technology, but that nearly half (49%) support the use of facial recognition technology in day-to-day policing, assuming appropriate safeguards are in place.

The Met said the ruling’s implications would be carefully considered before a decision was taken on any future use of live facial recognition technology.

Leicestershire police said they use facial recognition technology in criminal investigations, within locally agreed guidelines and legislation, to identify possible suspects. “[It] was last used at a public event in 2015, as a pilot scheme and it has not been used in that way since,” the force said.”
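The matching step the article describes (map faces in a crowd, then compare them to a watch list of images) is, in most modern systems, a similarity search over face embeddings. Below is a minimal illustrative sketch in Python, not South Wales Police’s actual AFR Locate system; the embedding dimension, threshold and identities are invented for illustration.

import numpy as np

EMBEDDING_DIM = 128    # assumed embedding size, typical of face models
MATCH_THRESHOLD = 0.6  # assumed similarity cut-off, not a real AFR setting


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face, watchlist):
    # Return the watch-list identity whose embedding best matches the
    # scanned face, or None if nothing clears the threshold. Per the
    # court's account, non-matching faces would not be retained.
    best_name, best_score = None, MATCH_THRESHOLD
    for name, ref in watchlist.items():
        score = cosine_similarity(face, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


# Demo with synthetic embeddings standing in for a real face model.
rng = np.random.default_rng(0)
watchlist = {
    "suspect_a": rng.normal(size=EMBEDDING_DIM),
    "missing_person_b": rng.normal(size=EMBEDDING_DIM),
}
scanned = watchlist["suspect_a"] + 0.1 * rng.normal(size=EMBEDDING_DIM)
print(match_against_watchlist(scanned, watchlist))  # expected: suspect_a

In a real deployment the random vectors would be replaced by embeddings produced by a trained face-recognition model, and, as the court was told, scans that match no watch-list entry would not have their data stored.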

Kim Thonger