What is the first thing that comes to mind when you hear the phrase ‘facial recognition technology’? Is it a TV show or movie scene in which law enforcement officers stare at computer monitors while a software program cycles through faces in a database, searching for a match to an image of a suspect, victim, or witness? Many associate the phrase ‘facial recognition technology’ with the government and law enforcement, an association reinforced by the way numerous procedural TV shows (such as FBI, Hawaii Five-0, Blue Bloods, and Law and Order: SVU) depict facial recognition in their episodes. For many Americans, those TV and movie scenes are their primary exposure to facial recognition, strengthening the association of facial recognition with law enforcement. While facial recognition technology (also known as facial recognition or FRT) is certainly a tool used by government and law enforcement officials, its uses and capabilities span far beyond what the entertainment industry depicts.
The concept of facial recognition originally began in the 1960s with a semi-automated system, which required an administrator to select facial features on a photograph before the software calculated and compared data. In the decades following, the technology improved, becoming entirely automated. The U.S. federal government invested heavily in the technology’s development in the mid-1990s, assessing various prototypes of facial recognition through the Defense Advanced Research Projects Agency (DARPA). While the idea of facial recognition technology had been around for decades, the American public’s first knowledge of and reaction to the technology followed a trial implementation at the 2001 Super Bowl where FRT compared surveillance images against a database of digital mugshots. Throughout the past two decades, this technology has become widespread, particularly within government and law enforcement communities, serving as an investigative and preventive tool to help identify missing children and combat fraud, including passport fraud.
The government’s investment and continued use over the past two decades, however, “propelled face recognition from its infancy to a market of commercial products,” resulting in the use of facial recognition in both public and private arenas. Aside from serving as a tool for law enforcement, today’s version of facial recognition technology also aids corporations, functioning as a “security feature of choice for phones, laptops, passports, and payment apps.” Major corporations such as Facebook and Shutterfly use facial recognition as part of their photo-tagging software whereas others like Microsoft and Apple use facial recognition technology as a security feature for users to unlock their phones and computers. Even companies like The Home Depot and Lowe’s are implementing facial recognition software, relying on it as a loss prevention tool.
Despite the technology’s increasingly pervasive use in the private and public sectors (exemplified by London’s recent announcement that the city will be incorporating live facial recognition technology into its surveillance cameras), clear guidelines and policies addressing legal and privacy concerns do not yet exist. Some jurisdictions have focused on addressing the issues surrounding the public sector’s use (i.e., the police’s use of facial recognition as an investigative tool and as a complement to surveillance systems). San Francisco and Oakland issued bans on facial recognition within their respective jurisdictions, specifically prohibiting use by police and other local agencies, citing bias concerns (including the technology’s historically higher inaccuracy rate with people of color and women) as part of the reasoning behind the bans.
Other jurisdictions have prioritized the issues surrounding private companies’ use of facial recognition technology. For instance, the City Council in Portland, Oregon, is exploring a potential ban on the use of facial recognition by private entities. The proposal has created tension between local legislators and businesses, with many business owners concerned that an outright ban could negatively impact their ability to grow and protect their businesses. A few other states, notably Illinois, have already enacted legislation that encompasses private companies’ use of facial recognition. Illinois’ Biometric Information Privacy Act (BIPA) permits citizens to sue corporations for collecting or storing their biometric information. In the landmark case Rosenbach v. Six Flags Entertainment Corp., decided in January 2019, the Illinois Supreme Court “held that private individuals may bring suit even if the only harm was a violation of their legal rights.” The Illinois Supreme Court sided against Six Flags, upholding citizens’ right to sue corporations for collecting biometric data without informing consumers how that data would be used. This expansive reading of the statute exposes corporations to significant liability stemming from the procedural requirements for the collection and storage of biometric data in Illinois. In fact, class action lawsuits against major corporations have been filed since the Illinois Supreme Court’s ruling. These class actions have targeted tech giants like Facebook, which recently agreed to a $550 million settlement in a class action stemming from the platform’s photo-tagging software, as well as brick-and-mortar retailers The Home Depot and Lowe’s, which have faced class action lawsuits arising out of the use of facial recognition as a loss prevention tool in stores.
Although some legislatures have attempted to address legal and privacy concerns surrounding facial recognition technology by creating restrictions or parameters on the technology’s use, many jurisdictions do not have any laws directly addressing FRT. Even the few jurisdictions – such as San Francisco and Illinois – that have enacted legislation are still struggling to navigate regulation and enforcement. Academics who have studied and identified biases in the technology have voiced concerns regarding the sale and use of FRT, stating that “these systems have systemic design flaws that have not been fixed… [which] may well negate their effectiveness.” These bias concerns have become politically and socially significant, with jurisdictions and members of Congress rethinking a larger rollout of facial recognition technology. Privacy advocates focus more on issues arising from the collection, storage, and potential distribution of individuals’ biometric information. Specifically, these advocates point to the issue of consent, noting that individuals’ faces are collected and then stored in a database without their knowledge or consent. This issue of consent is the centerpiece of the class action lawsuits against The Home Depot and Lowe’s – the plaintiffs’ central arguments are that the corporations violated Illinois’ BIPA by using facial recognition with their store surveillance systems as a loss prevention tool. Both suits, however, were recently withdrawn by the plaintiffs, without explanation or court action. Despite the withdrawal of these cases, the issue of consent is still being challenged and explored, as evidenced by the March 2020 filing of a lawsuit against the Chicago Blackhawks. In the complaint, the plaintiff alleges that the National Hockey League team has scanned his and other fans’ face biometrics at home games at the United Center using the facility’s security cameras since 2014.
The progression of this lawsuit has the potential to clarify requirements regarding notice and consent for brick-and-mortar retailers, entertainment venues, and businesses operating out of a physical location.
Out of all these debates, a single question arises – how will our society choose to balance technology’s pervasiveness in everyday life with individual privacy? Given the widespread use of camera and surveillance systems by public and private entities, society must decide when individuals retain an expectation of privacy in public. Furthermore, individuals must determine how much privacy they are willing to forgo for the ease and convenience of technology.
Society, however, has not yet struck this balance between privacy and facial recognition technology. As the bias concerns discussed above indicate, the technology’s accuracy in properly identifying individuals – particularly people of color – needs to improve, especially since FRT serves as an investigative tool for law enforcement. These concerns have gained political and societal momentum, further fueling the debates over facial recognition technology. Facial recognition is becoming more commonplace in everyday life, however, and some Americans have already chosen to relinquish some individual privacy for the convenience of the technology. For instance, Apple’s Face ID and Microsoft’s Windows Hello have been widely accepted by consumers in the few years since their respective launches. Thus, it appears that society has generally accepted the use of facial recognition technology in personal electronic devices. The sticking point instead seems to be the technology’s use in connection with surveillance and camera systems by public and/or private entities.
The modern version of facial recognition technology is still in its early developmental stages, resulting in a lack of clarity regarding its potential uses and capabilities. Consequently, legislative enforcement and regulatory policies remain largely undeveloped, and the few jurisdictions with enacted legislation are struggling to navigate this unknown territory. As facial recognition technology continues to improve, however, society must determine how much privacy it is willing to give up in exchange for the technology’s efficiency, ease, and convenience.
About the Author: Elise Kletz is a second-year law student at Cornell Law School. Elise earned a B.S.B.A. from the Olin Business School at Washington University in St. Louis, where she completed a double major in Leadership & Strategic Management and in Marketing. Currently, Elise serves as an editor for The Issue Spotter and serves as Co-President for the Business Law Society and the Jewish Law Student Association.
Suggested Citation: Elise Kletz, “Smile! You’re on Camera” – The Implications of the Use of Facial Recognition Technology, Cornell J.L. & Pub. Pol’y, The Issue Spotter, (May 1, 2020), http://jlpp.org/blogzine/smile-youre-on-camera-the-implication-of-the-use-of-facial-recognition-technology.