King’s Cross: ‘Facial recognition last used in 2018’

Facial-recognition technology has not been used at London’s King’s Cross Central development since March 2018, according to the 67-acre (0.3-sq-km) site’s developer.

When the Financial Times first reported the use of the technology, in August, a spokeswoman said it was to “ensure public safety”.

The King’s Cross development partnership now says only two on-site cameras used facial recognition.

They had been in one location and had been used to help the police, it added.

According to a statement on its website, the two cameras were operational between May 2016 and March 2018 and the data gathered was “regularly deleted”.

The King’s Cross partnership also denied any data had been shared commercially.

It said it had used the technology to help the Metropolitan and British Transport Police “prevent and detect crime in the neighbourhood”.

But both forces told BBC News they were unaware of any police involvement.

The partnership said it had since shelved further work on the technology and “has no plans to reintroduce any form of FRT [facial-recognition technology] at the King’s Cross estate”.

However, as recently as last month, a security company was advertising for a CCTV operator for the area.

The duties of the role included: “To oversee and monitor the health, safety and welfare of all officers across the King’s Cross estate using CCTV, Facewatch and surveillance tactics.”

The advert was later amended to remove this detail after BBC News raised the issue.

Following the FT’s report, the Information Commissioner’s Office (ICO) launched an investigation into how the facial-recognition data gathered was being stored.

The Mayor of London, Sadiq Khan, also wrote to the King’s Cross Central development group asking for reassurance its use of facial-recognition technology was legal.

The latest statement was posted online on the eve of technology giant Samsung opening an event space on the site, with a launch event planned for Tuesday evening, 3 September.

The FT reporter who broke the original story described the statement as “strange”.

One critic of facial-recognition technology, Dr Stephanie Hare, said many questions remained about what had been going on in the area, which, while privately owned, is open to the public and contains a number of bars, restaurants and family spaces.

“It does not change the fundamentals of the story in terms of the implications for people’s privacy and civil liberties, or the need for the ICO to investigate – they deployed this technology secretly for nearly two years,” she said.

“Even if they deleted data, I would want to know, ‘Did they do anything with it beforehand, analyse it, link it to other data about the people being identified? Did they build their own watch-list? Did they share this data with anyone else? Did they use it to create algorithms that have been shared with anyone else? And most of all, were they comparing the faces of people they gathered to a police watch-list?'”

Dr Hare also said it was unclear why the partnership had stopped using the technology.

“Was it not accurate? Ultimately unhelpful? Or did they get what they needed from this 22-month experiment?” she said.

Data regulator probes King’s Cross facial recognition tech

The facial-recognition system at King’s Cross is to be investigated by the UK’s data-protection watchdog.

Media exposure of live facial recognition at the site prompted the Information Commissioner’s Office (ICO) to look into how it was being used.

The ICO will inspect the technology in place and how it is operated to ensure it does not break data protection laws.

The regulator said it was “deeply concerned” about the growing use of facial-recognition technology.

Fair and transparent

The Financial Times was the first to report that a live face-scanning system was being used across the 67-acre (0.3-sq-km) site around King’s Cross station in London.

Developer Argent said it used the technology to “ensure public safety” and it was just one of “a number of detection and tracking methods” in place at the site.

But the use of cameras and databases to work out who is passing through and using the site has proved controversial.

So far, Argent has not said how long it has been using facial-recognition cameras, what the legal basis is for their use, or what systems it has in place to protect the data it collects.

In its statement, the ICO said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.”

The regulator said it was keen to ensure that the King’s Cross developer was using the technology in accordance with UK laws governing the use of data.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way,” said the ICO.

Any such organisation must have documented how and why it believed its use of the technology was legal, proportionate and justified, the regulator added.

Argent has not yet responded to a request for comment by BBC News.

The Mayor of London is also quizzing developer Argent about its use of facial-recognition systems.

Sadiq Khan wrote to the company and said there was “serious and widespread concern” about the legality of facial recognition.

King’s Cross developer defends use of facial recognition

The developer behind a 67-acre site in the King’s Cross area of central London has defended its use of facial recognition technology.

Under data protection laws, firms must provide clear evidence that there is a need to record and use people’s images.

A spokeswoman said the tool was used to “ensure public safety” and was one of “a number of detection and tracking methods”.

The local council said it was unaware that the system was in place.

The system’s existence was first reported by the Financial Times.

In a statement, developer Argent said it used cameras “in the interest of public safety” and likened the area to other public spaces.

“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” it said.

A spokeswoman declined to say what those systems were, how long the facial recognition had been in operation or what the legal basis was for its use, as is required under European data protection law.

Potential for inappropriate use

In addition to the National Rail, London Underground and Eurostar stations, King’s Cross is home to a number of restaurants, shops and cafes, as well as offices occupied by Google and Central Saint Martins college.

The college told the BBC it had “not been made specifically aware” that the tech was in use in the area and added that it did not use the technology inside its own buildings.

According to the King’s Cross website, planning permission for new additions to the site, granted in 2006, included:

  • 50 buildings
  • 1,900 homes
  • 20 streets
  • 10 public parks

The BBC has confirmed that London’s Canary Wharf is also seeking to trial facial recognition tools, as reported in the Financial Times.

The Information Commissioner’s Office (ICO) said it had general concerns about the potential for inappropriate use of the technology.

“Organisations wishing to automatically capture and use images of individuals going about their business in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances, and that there is a legal basis for that use,” it said in a statement.

“The ICO is currently looking at the use of facial recognition technology by law enforcement in public spaces and by private sector organisations, including where they are partnering with police forces.

“We’ll consider taking action where we find non-compliance with the law.”

South Wales Police faced a legal challenge to its use of facial recognition in 2018.

Despite this, it is currently running a three-month trial of a new app.

Chancellor Sajid Javid gave his backing to police trials of facial-recognition cameras last month, while he was still home secretary.

However, privacy groups have also voiced concerns about the implications of facial recognition for privacy rights.

“Facial recognition is nothing like CCTV – it’s not an accurate comparison,” said Stephanie Hare, an independent researcher and tech commentator.

“It allows us to be identified and tracked in real time, without our knowledge or our informed consent.

“We recognise the power of DNA and fingerprints as biometrics and their use is governed very strictly under UK law. We do not apply the same protections and restrictions to face, yet it is arguably even more powerful precisely because it can be taken without our knowledge.”

U.S. judge dismisses suit versus Google over facial recognition software

© Reuters. Google signage is seen at the Google headquarters in the Manhattan borough of New York City, New York

(Reuters) – A lawsuit filed against Google by consumers who claimed the search engine’s photo sharing and storage service violated their privacy was dismissed on Saturday by a U.S. judge who cited a lack of "concrete injuries."

U.S. District Judge Edmond Chang in Chicago granted a Google motion for summary judgment, saying the court lacked "subject matter jurisdiction because plaintiffs have not suffered concrete injuries."

The suit, filed in March 2016, alleged Alphabet Inc’s Google violated Illinois state law by collecting and storing biometric data from people’s photographs using facial recognition software without their permission through its Google Photos service.

Plaintiffs had sought more than $5 million collectively for the "hundreds of thousands" of state residents affected, according to court documents. Plaintiffs had asked the court for $5,000 for each intentional violation of the Illinois Biometric Information Privacy Act, or $1,000 for every negligent violation, court documents said.

Attorneys for the plaintiffs, as well as officials with Google, could not immediately be reached for comment. Google had argued in court documents that the plaintiffs were not entitled to money or injunctive relief because they had suffered no harm.

The case is Rivera v Google, U.S. District Court, Northern District of Illinois, No. 16-02714.
