There is growing pressure for more details about the use of facial recognition in London’s King’s Cross to be disclosed after a watchdog described the deployment as “alarming”.
Developer Argent has confirmed it uses the technology to “ensure public safety” but did not reveal any details.
It raises the issue of how private land used by the public is monitored.
The UK’s biometrics commissioner said the government needed to update the laws surrounding the technology.
Argent is responsible for a 67-acre site close to King’s Cross station.
While the land is privately owned, it is widely used by the public and is home to a number of shops, cafes and restaurants, as well as considerable office space with tenants including Google and Central Saint Martins College.
There had been no public indication that facial recognition was in use until the Financial Times revealed it.
UK biometrics commissioner Prof Paul Wiles has called for the government to take action over the use of facial recognition technology by the private sector as well as by law enforcement.
Facial recognition does not fall under his remit because current legislation only recognises DNA and fingerprints as biometrics.
While Argent has defended its use of the technology, it has repeatedly declined to explain what the system is, how it is used or how long it has been in operation.
“I have no idea what they’re trying to do in King’s Cross,” Prof Wiles told the BBC.
“There’s no point in having facial-matching tech unless you are matching it against some kind of database – now what is that database?
“It’s alarming whether they have constructed their own database or got it from somewhere else.
“There is a police database which I very much hope they don’t have access to.”
“Historically an area like that would have been public space governed by public control and legislation,” Prof Wiles added.
“Now a lot of this space is defined as private but to which the public has access.”
Silkie Carlo, director of civil liberties group Big Brother Watch, said she had identified an Avigilon H4 camera at King’s Cross which, according to its website, comes with “a sophisticated deep learning artificial intelligence (AI) search engine for video” enabling the rapid identification of a specific person or vehicle.
Camden Council told the BBC it was unaware of the tech in use at King’s Cross and another regional council said it would be a matter between a private developer and the information commissioner.
Facial recognition falls within the remit of the Information Commissioner’s Office (ICO), which polices data privacy.
The ICO has expressed concerns about the technology’s use, and under the European data protection law GDPR, firms must demonstrate they have a “legal basis” for adopting it.
Pace of change
Others have called for a change in the law but there is a sense of frustration about the challenge of generating that debate at government level.
Prof Wiles said he had been granted only one meeting with a minister in the three years since his appointment as biometrics commissioner.
Tony Porter, the surveillance camera commissioner, said he had made “repeated calls” for regulation to be strengthened.
Last month, MPs on the Commons Science and Technology Committee called for the police and other authorities to stop using live facial recognition tools, citing concerns about accuracy and bias.
“We need to have laws about all biometrics including ones we haven’t even thought about yet,” said Stephanie Hare, an independent researcher.
“We need to future-proof it. We need to discuss hugely its role in the private sector. The police and the government is one thing, we need to know if the private sector is allowed to do this and if so, under what conditions?”