Police guidance on facial recognition technology ‘a hammer blow to privacy’ – The Independent

Posted March 22nd, 2022 in facial mapping, identification, news, police, privacy, victims, witnesses by tracey

‘Innocent people like victims and potential witnesses could be placed on police watchlists under guidance on the use of facial recognition systems, civil liberties groups have warned.’

Full Story

The Independent, 22nd March 2022

Source: www.independent.co.uk

Facial recognition firm faces possible £17m privacy fine – BBC News

Posted November 30th, 2021 in artificial intelligence, data protection, facial mapping, fines, news, privacy by tracey

‘An Australian firm which claims to have a database of more than 10 billion facial images is facing a potential £17m fine over its handling of personal data in the UK.’

Full Story

BBC News, 29th November 2021

Source: www.bbc.co.uk

Uber Faces Legal Action Over ‘Racist’ Facial Recognition Software – Each Other

‘Uber is facing legal action following revelations that its facial recognition algorithm is five times more likely to cause the termination of darker-skinned workers.’

Full Story

Each Other, 11th October 2021

Source: eachother.org.uk

Uber facing new UK driver claims of racial discrimination – The Guardian

‘Uber is facing further claims for compensation over racial discrimination from drivers who say they were falsely dismissed because of malfunctioning face recognition technology.’

Full Story

The Guardian, 6th October 2021

Source: www.theguardian.com

Civil liberties groups demand ban on use of facial recognition technology by police – Local Government Lawyer

‘Liberty, Privacy International and 29 other organisations have called for Parliament to ban the use of live facial recognition technology (LFRT) by the police and private companies.’

Full Story

Local Government Lawyer, 31st August 2021

Source: www.localgovernmentlawyer.co.uk

New police CCTV use rules criticised as bare bones – BBC News

‘A proposed code of practice covering police use of live facial recognition in England and Wales has been criticised by human rights groups.’

Full Story

BBC News, 17th August 2021

Source: www.bbc.co.uk

Investigation of organisations using live facial recognition technology in public spaces found none compliant with data protection law: ICO – Local Government Lawyer

Posted June 18th, 2021 in data protection, facial mapping, local government, news, ombudsmen, privacy by tracey

‘An investigation by the Information Commissioner’s Office (ICO) published today (17 June) found that out of a group of organisations using live facial recognition (LFR) technology in public spaces, none were fully compliant with data protection law requirements.’

Full Story

Local Government Lawyer, 18th June 2021

Source: www.localgovernmentlawyer.co.uk

Ensuring the lawfulness of automated facial recognition surveillance in the UK – Oxford Human Rights Hub

‘In R (Bridges) v South Wales Police, the England and Wales Court of Appeal reviewed the lawfulness of the use of live automated facial recognition technology (‘AFR’) by the South Wales Police Force. CCTV cameras capture images of the public, which are then compared with digital images of persons on a watchlist.’

Full Story

Oxford Human Rights Hub, 3rd September 2020

Source: ohrh.law.ox.ac.uk

Policing Our Privacy – Where Does the Law Lie? – 39 Essex Chambers

‘Last Tuesday the Court of Appeal (Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Singh LJ) allowed the appeal of the civil liberties campaigner, Edward Bridges, against the decision of the Divisional Court which had dismissed his claim for judicial review of South Wales Police Force’s use of live automated facial recognition technology (“AFR”).’

Full Story

39 Essex Chambers, 17th August 2020

Source: www.39essex.com

Facial Recognition Technology not “In Accordance with Law” – UK Human Rights Blog

‘The Court of Appeal, overturning a Divisional Court decision, has found the use of a facial recognition surveillance tool by South Wales Police to be in breach of Article 8 of the European Convention on Human Rights (ECHR). The case was brought by Liberty on behalf of privacy and civil liberties campaigner Ed Bridges. The appeal was upheld on the basis that the interference with Article 8 of the ECHR, which guarantees a right to privacy and family life, was not “in accordance with law” due to an insufficient legal framework. However, the court found that, had it been in accordance with law, the interference caused by the use of facial recognition technology would not have been disproportionate to the goal of preventing crime. The court also found that the Data Protection Impact Assessment (DPIA) was deficient, and that the South Wales Police (SWP), who operated the technology, had not fulfilled their Public Sector Equality Duty.’

Full Story

UK Human Rights Blog, 13th August 2020

Source: ukhumanrightsblog.com

Police’s Automated Facial Recognition Deployments Ruled Unlawful by the Court of Appeal – Doughty Street Chambers

‘R (Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058; [2020] 8 WLUK 64 is thought to be the first case in the world to consider the use of facial recognition technology by law enforcement agencies. In this short article, we explore the judgment and its implications for the deployment of these and similar technologies in future.’

Full Story

Doughty Street Chambers, 12th August 2020

Source: insights.doughtystreet.co.uk

Let’s face it: use of automated facial recognition technology by the police – UK Police Law Blog

‘The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] EWCA Civ 1058 (handed down on 11 August 2020) was an appeal from what is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second with a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Court of Appeal in Bridges, this system constitutes an interference with Article 8 rights which is not such as is in accordance with the law, but which (critically) would be proportionate if a sufficiently narrow local policy were framed.’

Full Story

UK Police Law Blog, 11th August 2020

Source: ukpolicelawblog.com
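
The UK Police Law Blog excerpt above describes the mechanics of the AFR system in outline: cameras capture faces, a computer compares each captured face against a watchlist of under 800 people at a rate of around 50 faces a second, and any candidate match is flagged to a human operator for final assessment. Purely by way of illustration, the following is a minimal Python sketch of that watchlist-matching step; the embedding representation, cosine-similarity measure, threshold value and function names are assumptions made for this sketch, not details of the system actually deployed by South Wales Police.

import numpy as np

# Hypothetical illustration only: assumed names, threshold and embedding
# representation, not the configuration of any real police deployment.
SIMILARITY_THRESHOLD = 0.6  # assumed operating threshold

def match_against_watchlist(probe_embedding, watchlist_embeddings, watchlist_ids):
    """Compare one captured face embedding with every watchlist entry and,
    where the best cosine similarity clears the threshold, flag the candidate
    for a human operator's final assessment rather than acting automatically."""
    norms = np.linalg.norm(watchlist_embeddings, axis=1) * np.linalg.norm(probe_embedding)
    similarities = (watchlist_embeddings @ probe_embedding) / norms
    best = int(np.argmax(similarities))
    if similarities[best] >= SIMILARITY_THRESHOLD:
        return {"candidate_id": watchlist_ids[best],
                "score": float(similarities[best]),
                "requires_operator_review": True}
    return None  # no alert raised for this face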

South Wales police lose landmark facial recognition case – The Guardian

‘Campaigners are calling for South Wales police and other forces to stop using facial recognition technology after the court of appeal ruled that its use breached privacy rights and broke equalities law.’

Full Story

The Guardian, 11th August 2020

Source: www.theguardian.com

‘Deepfake’ warning over online courts – Legal Futures

‘Video manipulation software, including ‘deepfake’ technology, poses problems for remote courts in verifying evidence and confirming that litigants or witnesses are who they say they are, a report has warned.’

Full Story

Legal Futures, 29th July 2020

Source: www.legalfutures.co.uk

UK’s facial recognition technology ‘breaches privacy rights’ – The Guardian

‘Automated facial recognition technology that searches for people in public places breaches privacy rights and will “radically” alter the way Britain is policed, the court of appeal has been told.’

Full Story

The Guardian, 23rd June 2020

Source: www.theguardian.com

Equality watchdog demands suspension of use of automated facial recognition and predictive algorithms in policing – Local Government Lawyer

‘The Equality and Human Rights Commission (EHRC) has called for the suspension of the use of automated facial recognition (AFR) and predictive algorithms in policing in England and Wales, “until their impact has been independently scrutinised and laws are improved”.’

Full Story

Local Government Lawyer, 13th March 2020

Source: www.localgovernmentlawyer.co.uk

Let’s face it: use of automated facial recognition technology by the police – UK Police Law Blog

‘The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2019] EWHC 2341 (Admin); [2020] 1 WLR 672 is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second with a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Divisional Court in Bridges, this may, depending on the facts of each particular deployment, be lawful.’

Full Story

UK Police Law Blog, 21st February 2020

Source: ukpolicelawblog.com

Rules urgently needed to oversee police use of data and AI – report – The Guardian

‘National guidance is urgently needed to oversee the police’s use of data-driven technology amid concerns that it could lead to discrimination, a report has said.’

Full Story

The Guardian, 23rd February 2020

Source: www.theguardian.com

Watchdog rejects Met’s claim that he supported facial recognition – The Guardian

Posted February 13th, 2020 in equality, facial mapping, London, news, police by tracey

‘The official biometrics commissioner has rebuked the Metropolitan police after it falsely claimed that he supported its use of facial recognition CCTV in an equalities impact assessment published as the force made its first operational use of the controversial technology.’

Full Story

The Guardian, 12th February 2020

Source: www.theguardian.com

Met police deploy live facial recognition technology – The Guardian

‘The Metropolitan police have been accused of defying the warnings of their own watchdogs by beginning operational use of facial recognition CCTV, despite a scathing assessment of its effectiveness from the expert hired to scrutinise its trials.’

Full Story

The Guardian, 11th February 2020

Source: www.theguardian.com