Data regulator probes King’s Cross facial recognition tech

The facial-recognition system at King’s Cross is to be investigated by the UK’s data-protection watchdog.

Media exposure of live facial recognition at the site prompted the Information Commissioner’s Office (ICO) to look into how it was being used.

The ICO will inspect the technology in place and how it is operated to ensure it does not break data protection laws.

The regulator said it was “deeply concerned” about the growing use of facial-recognition technology.

Fair and transparent

The Financial Times was the first to report that a live face-scanning system was being used across the 67-acre (0.3 sq km) site around King’s Cross station in London.

Developer Argent said it used the technology to “ensure public safety” and it was just one of “a number of detection and tracking methods” in place at the site.

But the use of cameras and databases to work out who is passing through and using the site has proved controversial.

So far, Argent has not said how long it has been using facial-recognition cameras, what the legal basis is for their use, or what systems it has in place to protect the data it collects.

In its statement, the ICO said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.”

The regulator said it was keen to ensure the King’s Cross developer was using the technology in accordance with UK laws governing the use of data.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way,” said the ICO.

The developer must have documented how and why it believed its use of the technology was legal, proportionate and justified, the regulator added.

Argent has not yet responded to a request for comment by BBC News.

The mayor of London is also quizzing developer Argent about its use of facial-recognition systems.

Sadiq Khan wrote to the company and said there was “serious and widespread concern” about the legality of facial recognition.


King’s Cross developer defends use of facial recognition

The developer behind a 67-acre site in the King’s Cross area of central London has defended its use of facial recognition technology.

Under data protection laws, firms must provide clear evidence that there is a need to record and use people’s images.

A spokeswoman said the tool was used to “ensure public safety” and was one of “a number of detection and tracking methods”.

The local council said it was unaware that the system was in place.

The system’s use was first reported by the Financial Times.

In a statement, developer Argent said it used cameras “in the interest of public safety” and likened the area to other public spaces.

“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” it said.

A spokeswoman declined to say what those systems were, how long the facial recognition had been in operation, or what the legal basis was for its use, although European data protection law requires organisations to have one.

Potential for inappropriate use

In addition to the National Rail, London Underground and Eurostar stations, King’s Cross is home to a number of restaurants, shops and cafes, as well as offices occupied by Google and Central Saint Martins college.

The college told the BBC it had “not been made specifically aware” that the tech was in use in the area, and added that it did not use the technology inside its own buildings.

According to the King’s Cross website, planning permission for new additions to the site, granted in 2006, included:

  • 50 buildings
  • 1,900 homes
  • 20 streets
  • 10 public parks

The BBC has confirmed that London’s Canary Wharf is also seeking to trial facial recognition tools, as reported in the Financial Times.

The Information Commissioner’s Office (ICO) said it had general concerns about the potential for inappropriate use of the technology.

“Organisations wishing to automatically capture and use images of individuals going about their business in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances, and that there is a legal basis for that use,” it said in a statement.

“The ICO is currently looking at the use of facial recognition technology by law enforcement in public spaces and by private sector organisations, including where they are partnering with police forces.

“We’ll consider taking action where we find non-compliance with the law.”

South Wales Police faced a legal challenge to its use of facial recognition in 2018.

Despite this, the force is currently running a three-month trial of a new facial-recognition app.

Sajid Javid, now the chancellor, gave his backing to police trials of facial-recognition cameras last month, while he was still home secretary.

However, privacy groups have voiced concerns about the implications of facial recognition for privacy rights.

“Facial recognition is nothing like CCTV – it’s not an accurate comparison,” said Stephanie Hare, an independent researcher and tech commentator.

“It allows us to be identified and tracked in real time, without our knowledge or our informed consent.

“We recognise the power of DNA and fingerprints as biometrics and their use is governed very strictly under UK law. We do not apply the same protections and restrictions to face, yet it is arguably even more powerful precisely because it can be taken without our knowledge.”
