
Overview: Facial Recognition in Criminal Enforcement

Introduction

While the title of the article may have made you chuckle, the reality is quite depressing. This article explores the use of facial recognition technology (FRT) in criminal identification, the problems that arise from its implementation, and the case law in which courts have addressed these concerns.

The Rise of Facial Recognition in Criminal Law Enforcement

Facial recognition refers to the process of identifying or confirming an individual's identity from their facial features. It belongs to the family of biometric technologies that rely on physical or behavioral characteristics, such as fingerprints, iris scans, voice patterns, or signatures, to establish identity. With advancements in artificial intelligence (AI) and deep learning, FRT has evolved into an integral tool in what some call “smart criminal justice.”
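Before turning to the legal issues, it helps to see how simple the core matching step is. The sketch below, in Python, is purely illustrative and not drawn from any deployed system: it assumes a deep learning model has already converted each face image into a fixed-length numeric embedding, and that identification amounts to finding the closest stored embedding under a tunable distance threshold. The names, vectors, and the 0.6 threshold are hypothetical.

  import numpy as np

  # Illustrative only: in a real system, a deep learning model converts each
  # face image into a fixed-length numeric "embedding". Here the embeddings
  # are assumed to exist already as NumPy vectors.

  def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
      """Distance between two face embeddings; smaller means more similar."""
      return float(np.linalg.norm(a - b))

  def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
      """Return the gallery identity closest to the probe embedding,
      but only if the distance falls under the (tunable) match threshold."""
      best_name, best_dist = None, float("inf")
      for name, embedding in gallery.items():
          d = euclidean_distance(probe, embedding)
          if d < best_dist:
              best_name, best_dist = name, d
      return best_name if best_dist < threshold else None

  # Toy usage: random vectors stand in for real embeddings.
  rng = np.random.default_rng(0)
  gallery = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
  probe = gallery["suspect_a"] + rng.normal(scale=0.01, size=128)  # near-duplicate face
  print(identify(probe, gallery))  # -> "suspect_a"

The threshold is where much of the legal risk originates: set too loosely, such a system produces confident-looking false matches of the kind discussed in the wrongful-arrest cases below.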

Criminal law enforcement has also witnessed a surge in AI-driven tools in recent years. FRT is now a key component of security and criminal justice systems in many developed countries. However, its deployment has raised several legal eyebrows.

Legal Challenges and Court Rulings

United Kingdom: Bridges Case

The UK Court of Appeal addressed these concerns in R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058. The court held that the use of Automated Facial Recognition (AFR) technology by the South Wales Police violated Article 8 of the European Convention on Human Rights. The ruling highlighted that the technology indiscriminately collected and processed biometric data without consent, thereby infringing on individuals’ right to privacy. Furthermore, the Data Protection Impact Assessment (DPIA) conducted by the police failed to comply with the Data Protection Act 2018, as it did not properly assess the risks to people’s rights and freedoms.

The case perfectly encapsulates how existing legal safeguards, while intended to regulate sensitive personal data, struggle to address the complex implications of FRT. The Court of Appeal acknowledged that the legal framework governing facial recognition must be periodically reassessed and refined in response to technological advancements.

France: Proportionality in Schools

In 2019, a regional council in southern France deployed face-scanning tools at the gates of two high schools in Nice and Marseille to regulate the entry of students. The French Data Protection Authority (CNIL) found that this violated the principle of proportionality under the General Data Protection Regulation (GDPR). The French administrative court upheld CNIL’s position, emphasizing that:

  1. The regional authority acted ultra vires (beyond its legal power), as the school’s security fell under the Head of School’s jurisdiction.
  2. There was a lack of proper informed consent for data collection.
  3. Less intrusive alternatives, such as ID badges and video surveillance, could achieve the same objectives without compromising the privacy of individuals.

Wrongful Arrests and Bias in FRT

United States: Robert Williams’ Wrongful Arrest

In January 2020, Detroit police wrongfully arrested Robert Williams in front of his family and detained him for thirty hours in unsanitary conditions. Williams v. City of Detroit, No. 2:21-cv-10827 (E.D. Mich.), became the first publicly known instance of a wrongful arrest resulting from an FRT misidentification. The case was settled in June 2024, and Detroit police are now required to:

  • Support FRT results with independent, reliable evidence before making an arrest.
  • Undergo specialized training on the risks of facial recognition, particularly its higher error rates for people from marginalized communities.

Clearview AI and Mass Surveillance

In ACLU v. Clearview AI, Inc., 2021 Ill. Cir. LEXIS 292, the American Civil Liberties Union (ACLU) sued Clearview AI over its secretive mass surveillance practices. Clearview’s technology compiled billions of faceprints by scraping images posted online, raising serious privacy concerns. As a result, Clearview AI was:

  • Permanently banned in the U.S. from providing its database to private entities.
  • Prohibited from selling its facial recognition services to any entity in Illinois for five years.

Despite the cases mentioned above, there has so far been little research examining FRT regulation from a law enforcement perspective or evaluating how existing rules might govern FRT use in criminal proceedings (i.e., in the investigation of crimes).

Conclusion: A Call for Regulation

While FRT holds great promise for crime prevention, its widespread use raises pressing legal, ethical, and human rights concerns. Cases across the world have demonstrated the potential for privacy breaches, wrongful arrests, and biased identification. There is an urgent need for robust legal frameworks, periodic reviews, and stricter accountability measures to prevent misuse. Without proper regulation in place, the future is in dire straits.

The author affirms that this article is an entirely original work, never before submitted for publication at any journal, blog or other publication avenue. Any unintentional resemblance to previously published material is purely coincidental. This article is intended solely for academic and scholarly discussion. The author takes personal responsibility for any potential infringement of intellectual property rights belonging to any individuals, organizations, governments, or institutions.

References

  1. Giuseppe Mobilio, ‘Your face is not new to me – Regulating the surveillance power of facial recognition technologies’ (2023) 12(1) Internet Policy Review <https://doi.org/10.14763/2023.1.1699> accessed 26 January 2025.
  2. R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058.
  3. Data Protection Act 2018.
  4. <https://www.hoganlovells.com/en/publications/facial-recognition-challenged-by-french-administrative-court#:~:text=In%20a%20decision%20dated%2027,and%20speed%20up%20entry%20of> accessed 26 January 2025.
  5. Williams v City of Detroit, No 2:21-cv-10827 (ED Mich).
  6. <https://www.media.mit.edu/projects/gender-shades/overview/> accessed 27 January 2025.
  7. ACLU v Clearview AI, Inc, 2021 Ill Cir LEXIS 292.
  8. Vera Lucia Raposo, ‘The use of facial recognition technology by law enforcement in Europe: a non-orwellian draft proposal’ (2022) 29 European Journal on Criminal Policy and Research 515.
