“Smile! You’re on Camera” – The Implications of the Use of Facial Recognition Technology

What is the first thing that comes to mind when you hear the phrase ‘facial recognition technology’? Is it a TV show or movie scene in which law enforcement officers stare at computer monitors while a software program cycles through faces in a database, searching for a match to an image of a suspect, victim, or witness? Many associate the phrase ‘facial recognition technology’ with the government and law enforcement, an association reinforced by the way numerous procedural TV shows (such as FBI, Hawaii Five-0, Blue Bloods, and Law and Order: SVU) depict facial recognition in their episodes. For many Americans, those TV and movie scenes are their primary exposure to facial recognition, strengthening the association of facial recognition with law enforcement. While facial recognition technology (also known as facial recognition or FRT) is certainly a tool used by government and law enforcement officials, its uses and capabilities extend far beyond what the entertainment industry depicts. The concept of facial recognition began in the 1960s with a semi-automated system, which required an administrator to select facial features on a photograph before the software calculated and [read more]

Facial Recognition Software, Race, Gender Bias, and Policing

Facial Recognition Technology (FRT) identifies a person’s face through computer programs that access thousands of cameras worldwide to identify a suspect, criminal, or fugitive. FRT can even accurately identify a person from a blurry captured image or instantaneously pick a subject out of a crowd. At least, that is the fantasy portrayed in Hollywood movies. In reality, facial recognition software is inaccurate, biased, and under-regulated. FRT creates a template from a person’s facial image and compares that template to millions of photographs stored in databases: driver’s license photos, mugshots, government records, or social media accounts. While this technology aims to accelerate law enforcement investigative work and more accurately identify crime suspects, it has been criticized for its bias against African-American women. The arm of the United States government responsible for establishing standards for technology, the National Institute of Standards and Technology (NIST), conducted a test in July 2019 that showed FRT’s bias against African-American women. NIST tested the algorithms of Idemia, a French company that provides facial recognition software to police in the United States, for accuracy. The software mismatched white women’s faces once in 10,000 instances, but it mismatched black women’s faces once in 1,000 instances, a tenfold disparity. This disparity is a result of the failure of current facial [read more]
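To make the template-matching step described above concrete, here is a minimal sketch in Python of a one-to-many gallery search, assuming templates are numeric feature vectors compared by cosine similarity. The function names, the 0.6 threshold, and the random vectors standing in for a real face-embedding model are illustrative assumptions, not Idemia’s or any other vendor’s actual algorithm.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two facial templates (feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe: np.ndarray, gallery: dict,
                   threshold: float = 0.6) -> list:
    """One-to-many search: compare a probe template against every
    enrolled template and return candidates scoring above the threshold.
    The 0.6 threshold is an arbitrary illustrative value; real systems
    tune this operating point, which also sets the false match rate."""
    hits = [(name, cosine_similarity(probe, template))
            for name, template in gallery.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

# Toy example: random 128-dimensional "templates" stand in for the
# output of a real embedding model applied to database photographs.
rng = np.random.default_rng(0)
gallery = {f"record_{i}": rng.normal(size=128) for i in range(5)}
probe = gallery["record_3"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(search_gallery(probe, gallery))
```

The threshold in a sketch like this trades false matches against false non-matches, and that operating point is exactly where the disparity NIST measured appears: at the reported rates, the same search would falsely match black women’s faces roughly ten times as often as white women’s.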