Facial recognition technology has become pervasive in both the public and private sectors. Facial recognition is one warrantless tool used by law enforcement. It works by using “biometric software to map a person’s facial features from a video or photo.” The system then attempts to match that facial map against photo databases. While three cities have banned the use of facial-recognition technology, and California has barred law enforcement from using facial recognition in officer body cameras, its use across the United States is widespread. The Michigan State Police alone maintains a database of more than 50 million photos, from which searches are run at the request of law enforcement agencies. There is fierce debate on the topic, and only through analysis of both the benefits and detriments of facial recognition use by law enforcement can a proper assessment be made of its broad impact. The private sector has propelled facial recognition technology into public use, and the technology has numerous problems; but when used in tandem with human recognition capacity, it has the potential to better protect the public.
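For readers curious about the mechanics, the matching step described above can be sketched in a few lines of Python: a face is reduced to a numerical feature vector (an “embedding”), which is then compared against a database of vectors by similarity. Everything below is an illustrative assumption; real systems use learned neural-network embeddings and carefully calibrated thresholds.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors, from -1 to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    """Return (name, score) for the closest database entry,
    or None if no entry clears the threshold.  The 0.9 threshold
    is a hypothetical value, not one used by any real system."""
    name, vec = max(database.items(),
                    key=lambda kv: cosine_similarity(probe, kv[1]))
    score = cosine_similarity(probe, vec)
    return (name, score) if score >= threshold else None
```

A probe that falls below the threshold yields no match at all, which is one reason agencies describe results as investigative leads rather than positive identifications.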
First, facial recognition has high error rates, and there have been cases of manipulated images. There is also broad concern that its use will “disparately impact people of color.” In one federal study, “Asian and African American people were up to 100 times more likely to be misidentified than white men.” In an experiment conducted by the American Civil Liberties Union, a common face-scanning program inaccurately matched 26 California legislators with mugshots. In a similar experiment, software incorrectly matched 28 federal legislators with mugshots. There are numerous alleged privacy and civil liberties violations, and ongoing lawsuits against the Justice Department, FBI, and Drug Enforcement Administration over the nationwide use of facial-recognition software.
Second, the private sector is responsible for developing and selling the facial recognition technology used by police departments. Companies such as Clearview AI (which alone works with more than 600 law enforcement agencies) scrape billions of photos from the internet and provide them to law enforcement. The NYPD tried the Clearview app and expressed concerns about who had access to photos once they were uploaded to it. The NYPD declined to adopt the app long term, but New York police officers are still using it on their personal devices. The Washington County Sheriff’s Office uses Amazon’s facial recognition software, which has higher error rates for female and darker-skinned faces. Amazon’s “Rekognition” technology competes in a facial-recognition market projected to reach $8 billion in global value by 2022. Amazon reaps large financial benefits but does not know how many police departments use its technology, making oversight of potential misuse or errors difficult. With so much money at stake, private enterprises are forgoing privacy interests to cut into a growing market, even when their products are flawed and exploitative. Still, not all private firms are buying into the surveillance state: some tech executives refuse to sell their products to police departments because of the error risk, regardless of potential profit.
Alternatively, the potential for positive outcomes with facial recognition technology is promising. In one case, an accused man was tracked down less than a day after he allegedly attempted to violently kidnap and rape a woman. Eugene O’Donnell, a professor and former police officer, stated: “Chances are this wouldn’t have been solved . . . It’s not when it’s solved – it’s if.” An NYPD spokeswoman noted that facial recognition technology is not the sole basis for an arrest, but rather a lead that is “valuable in combatting crime.” In another case, facial recognition led to the arrest of a man accused of sexually assaulting a minor. The suspected killer in the Maryland Capital Gazette shooting was also identified by facial recognition technology; police turned to it because identification from fingerprints was proceeding slowly. The Maryland police chief stated, “We would have been much longer in identifying him and being able to push forward in the investigation without that system.”
Relatedly, facial recognition generally saves police departments time and resources. Because of California’s current ban on law enforcement use of facial recognition technology, many law enforcement authorities in San Diego County agree that they will have to rely on fingerprinting to identify people. The time spent on fingerprinting means less time for officers to patrol and protect community safety.
One important question for algorithmic facial recognition is whether it outperforms human recognition capacity. Humans and computers perform roughly evenly at identification tasks. While both trained individuals and machines have biases and will inevitably make errors, combining an algorithm with human professionals yields a far more accurate result. If the technology can be better utilized to avoid harm, and professionals remain aware of their own biases and those of the machines, there is strong potential for improved recognition and optimal outcomes for law enforcement.
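The idea of pairing an algorithm with a trained human reviewer can be illustrated with a toy decision rule. The weights and threshold below are purely hypothetical assumptions for illustration; they do not describe any deployed system.

```python
def fused_decision(machine_score, human_score,
                   machine_weight=0.5, threshold=0.8):
    """Blend an algorithm's confidence with a human examiner's
    confidence (both on a 0-1 scale) and declare a match only when
    the combined confidence clears the threshold.  All numbers are
    illustrative assumptions, not values from a real system."""
    combined = (machine_weight * machine_score
                + (1 - machine_weight) * human_score)
    return combined >= threshold
```

Under such a rule, a confident algorithm cannot produce a match on its own: a skeptical human reviewer pulls the blended score below the threshold, consistent with treating the technology as a lead rather than the sole basis for an arrest.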
Consequently, facial recognition usage by law enforcement is controversial. It is riddled with errors and in some ways embodies the digital age’s erosion of privacy. Yet it is also a time-saving tool that, when combined with human-powered identification, can keep communities safe. In the absence of federal policy on facial recognition technology, it is up to localities, cities, and states to decide which interests to put first. There is, however, discussion in the federal government of the risks associated with both public and private facial recognition. This is a bipartisan issue: the U.S. House Committee on Oversight and Reform has held three hearings on the matter. Rep. Jim Jordan, R-Ohio, summed up the hearings: “It doesn’t matter if it’s a President Trump rally or a Bernie Sanders rally. The idea of American citizens being tracked . . . for merely showing their faces in public is deeply troubling.” Rep. Alexandria Ocasio-Cortez, D-N.Y., stated “[t]his is some real life ‘Black Mirror’ stuff.” Both representatives, from across the aisle, agree that something must be done. The power of facial recognition technology has no borders, and its use by law enforcement is both a solution to, and a problem produced by, the digital age.
As of February 29, 2020