
Police facial recognition system faces legal challenge

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group.
Automatic facial recognition (AFR) uses CCTV or surveillance cameras to record facial characteristics and compare them with images held on police databases.
Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act.
The Metropolitan Police says the technology will help keep London safe.
The system is being piloted in London, with three other forces – Humberside, South Wales, and Leicestershire – also trialling the technology.
‘Very intrusive’
However, it has proved controversial, with one watchdog describing its use in public places as “very intrusive”.
Court documents, seen by the BBC, also claim the Home Office has failed in its duty to properly regulate AFR’s use.
Manufacturers of the systems say they can monitor multiple cameras in real time, “matching” thousands of faces a minute against images already held by the police – often mugshots of suspects taken into custody.
However, Big Brother Watch says the Met’s own research, published in May, shows that during trials only two genuine matches were made out of 104 system “alerts”.
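On those figures, roughly 98% of the alerts did not correspond to a genuine match. The sketch below is a minimal, purely illustrative outline of how such systems generally work: a face seen by a camera is reduced to a numerical “embedding” and compared against a watchlist, with an alert raised when the similarity passes a threshold. The embedding size, threshold and random stand-in data are assumptions for illustration, not details of any vendor’s or force’s actual system.

```python
import numpy as np

# Purely illustrative sketch of threshold-based face matching against a
# watchlist. Embedding size, threshold and random stand-in vectors are
# assumptions; real systems derive embeddings from a face-recognition model.

EMBEDDING_DIM = 128     # assumed embedding size
ALERT_THRESHOLD = 0.6   # assumed similarity threshold

rng = np.random.default_rng(0)

watchlist = rng.normal(size=(50, EMBEDDING_DIM))       # e.g. custody images
camera_faces = rng.normal(size=(1000, EMBEDDING_DIM))  # faces seen by cameras

def normalise(x):
    """Scale each embedding to unit length so dot products give cosine similarity."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

watchlist = normalise(watchlist)
camera_faces = normalise(camera_faces)

# Cosine similarity of every camera face against every watchlist entry;
# an alert is raised when the best match clears the threshold.
similarities = camera_faces @ watchlist.T
alerts = int((similarities.max(axis=1) >= ALERT_THRESHOLD).sum())

print(f"{alerts} alerts raised from {len(camera_faces)} faces scanned")
```

In a sketch like this, lowering the threshold produces more alerts but also more false ones, which is the trade-off the trial figures illustrate.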
The group also takes issue with the length of time the images gathered by AFR are held.
The Met piloted the system at Notting Hill Carnival in 2016 and 2017, at the Cenotaph on Remembrance Sunday, and at Westfield Shopping Centre in Stratford last month.
Further trials are planned. The force says the technology is “an extremely valuable tool”.
Meanwhile, in South Wales, police used AFR at least 18 times between May 2017 and March 2018, according to court documents.

Source: BBC