Privacy

Zooming in on Student Surveillance: Protecting Student Privacy in the Age of COVID-19

(Source) Exams are stressful even under the best of conditions. Exams taken virtually, as so many students have discovered over the past year, present a whole new set of challenges that can magnify student stress. But imagine for a moment that you cannot even get into your exam because the exam software does not recognize your face, or that the eye-movement tracking built into the exam software means that looking away from your screen for a moment gets you flagged for cheating. This is, in fact, the reality that many students have faced over the past year of virtual learning and testing. Indeed, when COVID-19 hit in early 2020, teachers and their students, from kindergarten to graduate school, had to pivot quickly to virtual learning and testing modalities. While this transition was certainly a necessity to keep students, their teachers, and their families safe, virtual learning and testing nonetheless raises civil liberties concerns around privacy and freedom of speech and perpetuates inequality for those who are people of color, low income, non-binary, or neurodivergent. In the United States, unlike in other countries, there is no “national” privacy law. Rather, a patchwork of laws makes [read more]
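The flagging behavior the excerpt describes can be surprisingly crude. As a purely hypothetical illustration (proctoring vendors do not publish their rules, and every name and threshold below is invented), a minimal Python sketch of a frame-count rule shows why even a momentary glance away from the screen can trigger a cheating flag:

```python
# Toy model of the gaze-tracking rule described above. Real proctoring
# products are proprietary; this hypothetical rule simply flags any run
# of consecutive video frames in which the estimated gaze is off-screen
# for longer than a fixed limit.

OFF_SCREEN_FRAME_LIMIT = 15  # hypothetical: about half a second at 30 fps

def flag_suspicious(gaze_on_screen):
    """Return the starting frame of each off-screen run that exceeds the limit."""
    flags = []
    run_start, run_len = None, 0
    for i, on_screen in enumerate(gaze_on_screen):
        if not on_screen:
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len == OFF_SCREEN_FRAME_LIMIT + 1:
                flags.append(run_start)
        else:
            run_len = 0
    return flags

# A student who glances down at scratch paper for one second (30 frames)
# is flagged, even though nothing improper occurred.
frames = [True] * 60 + [False] * 30 + [True] * 60
print(flag_suspicious(frames))  # -> [60]
```

Under a rule of this shape, the only tunable protection against false accusations is the frame limit itself, which is exactly the kind of design choice students never get to see or contest.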

“Smile! You’re on Camera” – The Implications of the Use of Facial Recognition Technology

(Source) What is the first thing that comes to mind when you hear the phrase ‘facial recognition technology’? Is it a TV show or movie scene where law enforcement officers stare at computer monitors while a software program cycles through faces in a database, searching for a match to an image of a suspect, victim, or witness? Many associate the phrase ‘facial recognition technology’ with the government and law enforcement, an association reinforced by the way numerous procedural TV shows (such as FBI, Hawaii Five-0, Blue Bloods, and Law and Order: SVU) feature facial recognition in their episodes. For many Americans, those TV and movie scenes are their primary exposure to the technology, strengthening the association of facial recognition with law enforcement. While facial recognition technology (also known as facial recognition or FRT) is certainly a tool used by government and law enforcement officials, its uses and capabilities span far beyond what the entertainment industry depicts. Facial recognition began in the 1960s with a semi-automated system, which required an administrator to select facial features on a photograph before the software calculated and [read more]

Facial Recognition Software, Race, Gender Bias, and Policing

(Source) Facial Recognition Technology (FRT) identifies a person’s face by tapping into thousands of cameras worldwide to track down a suspect, criminal, or fugitive. It can even accurately identify a person’s face from a blurry captured image or instantaneously pick a subject out of a crowd. This is the fantasy portrayed in Hollywood movies. In reality, facial recognition software is inaccurate, biased, and under-regulated. FRT creates a template of a person’s facial image and compares that template to millions of photographs stored in databases—driver’s license photos, mugshots, government records, or social media accounts. While this technology aims to accelerate law enforcement investigative work and more accurately identify crime suspects, it has been criticized for its bias against African-American women. The arm of the United States government responsible for establishing technology standards—the National Institute of Standards and Technology (NIST)—conducted a test in July 2019 that showed FRT’s bias against African-American women. NIST tested the algorithms of Idemia—a French company that provides facial recognition software to police in the United States—for accuracy. The software mismatched white women’s faces once in 10,000 instances, but it mismatched black women’s faces once in 1,000. This disparity is a result of the failure of current facial [read more]
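To make the template-and-compare pipeline concrete, here is a minimal Python sketch of one-to-many matching under stated assumptions: the “templates” are random vectors standing in for learned face embeddings, and the gallery size, dimension, and threshold are all placeholders, not any vendor’s actual algorithm.

```python
import numpy as np

# Minimal sketch of the one-to-many template matching described above.
# Deployed FRT systems derive templates from deep neural networks; here
# the templates are random vectors, purely to illustrate the search logic.

rng = np.random.default_rng(0)
EMBEDDING_DIM = 128  # assumed template size

def cosine_similarity(a, b):
    """Score how alike two face templates are (1.0 = identical direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A gallery of enrolled templates (e.g., driver's-license photos, mugshots).
gallery = {f"person_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(10_000)}

# A probe template extracted from an investigator's image.
probe = rng.normal(size=EMBEDDING_DIM)

# Rank every enrolled template by similarity; anything above the decision
# threshold is reported as a candidate match.
THRESHOLD = 0.35  # arbitrary placeholder
scores = sorted(((name, cosine_similarity(probe, t)) for name, t in gallery.items()),
                key=lambda pair: pair[1], reverse=True)
hits = [(name, round(s, 3)) for name, s in scores[:5] if s >= THRESHOLD]
print("candidate matches:", hits if hits else "none above threshold")
```

The structural point the sketch illustrates is that the decision threshold fixes the false match rate: if the underlying templates are less discriminative for one demographic group, the same threshold produces more false matches for that group, which is the kind of disparity the NIST testing measured.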

Do Not Access – Is Law Enforcement Access to Commercial DNA Databases a Substantial Privacy Concern?

(Source) The use of forensic genetic genealogy (FGG) as an investigative tool for law enforcement has become, “if not exactly routine, very much normalized.” This normalization is due in large part to law enforcement’s use of FGG to identify and arrest the Golden State Killer. The April 2018 arrest gained national recognition, and subsequently, so did the police’s use of FGG as an investigative tool to home in on suspects. Forensic genetic genealogy has immense potential as an investigative tool: it helps investigators “reduce the size of the haystack” by identifying the suspect’s family—making it that much more probable to find the needle. In the case of the Golden State Killer, law enforcement used GEDmatch, a public website that produces possible familial matches based on users’ genetic profiles. The site allows users to upload genetic profiles generated by third parties (such as 23andMe and Ancestry.com), which is how law enforcement uploaded a DNA profile of the Golden State Killer suspect. That profile, uploaded under a fake name, produced a partial match that led law enforcement to a distant relative. By narrowing down the possible suspect pool to one family [read more]
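As a rough illustration of how a “partial match” can point investigators to a family rather than an individual, here is a toy Python sketch. It substitutes simple marker agreement for the shared-segment (centimorgan) analysis real genealogy services perform, and every name and number in it is invented:

```python
import random

# Toy sketch of partial ("familial") DNA matching in the spirit of what
# the excerpt describes GEDmatch doing. Real services compare shared
# identical-by-descent segments measured in centimorgans; this stand-in
# just counts matching markers, which is enough to show how a partial
# match surfaces a relative rather than the suspect.

random.seed(42)
NUM_MARKERS = 1_000
ALLELES = "ACGT"

def random_profile():
    return [random.choice(ALLELES) for _ in range(NUM_MARKERS)]

def make_relative(profile, copied_fraction):
    """Copy a fraction of the profile's markers; the rest are random."""
    return [m if random.random() < copied_fraction else random.choice(ALLELES)
            for m in profile]

suspect = random_profile()

# A public database: unrelated users plus one distant relative of the suspect.
database = {f"user_{i}": random_profile() for i in range(500)}
database["distant_relative"] = make_relative(suspect, 0.25)

def match_fraction(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Unrelated profiles agree ~25% of the time by chance alone (four alleles),
# so only scores well above that baseline suggest kinship.
PARTIAL_MATCH_THRESHOLD = 0.35
relatives = [(name, round(match_fraction(suspect, p), 3))
             for name, p in database.items()
             if match_fraction(suspect, p) >= PARTIAL_MATCH_THRESHOLD]
print("possible relatives:", relatives)
```

Even in this toy version, the suspect never needs to be in the database at all; one relative’s profile is enough to shrink the haystack to a single family tree, which is precisely what raises the privacy question in the post’s title.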