EVANSVILLE — Police in Evansville have enjoyed near-total freedom to deploy cutting-edge facial recognition technology with almost no oversight from judges, prosecutors or officials — a situation that has led some legal experts to raise concerns about local use of the technology.
For more than a year, key city and county leaders, and the public, remained largely in the dark about the Evansville Police Department’s use of Clearview AI software, which is regarded as possibly the most powerful facial recognition technology suite in the world.
Citing public records and interviews with EPD officials, the Courier & Press publicly disclosed details about the department’s use of facial recognition technology last week.
Aside from Clearview AI’s terms of service, no public policies specifically regulate how the EPD can use its facial recognition tools in Evansville, and details about how Clearview AI evidence is leveraged to arrest and charge criminal defendants remain murky.
Vanderburgh County Prosecutor Diana Moers told the Courier & Press she was not aware of Clearview AI, or facial recognition technology, ever coming up in local courtrooms, despite the apparent routine use of such technology by Evansville police.
Neither was Vanderburgh County’s Chief Public Defender, Steve Owens. Police also did not brief Evansville's city council about the decision to acquire and use the technology, according to three current council members.
For Fred H. Cate, a senior fellow at Indiana University’s Center for Applied Cybersecurity Research, that’s a problem.
“Whenever I consult with a government body, one thing I always say is, ‘You want to make the rules before you buy the technology,’” Cate told the Courier & Press. “Especially with a technology like this, where a number of cities have banned the technology.”
Hoan Ton-That, the founder and CEO of Clearview AI, told the Courier & Press his company properly trains law enforcement customers and enforces "strong auditing features" to prevent any potential abuse of facial recognition technology.
Moers: 'I don't think it hurts to be specific'
According to Evansville Police Chief Billy Bolin, EPD detectives regularly use Clearview AI to investigate an assortment of crimes. Facial recognition software has proven especially helpful at cracking shoplifting cases, Bolin said, and it’s often used as an initial step during investigations.
But where that evidence goes, and whether it is used to help establish probable cause that a crime has been committed, is largely unknown to the public and the broader Evansville legal community.
That could be due, in part, to EPD detectives' practice of referring to Clearview AI evidence indirectly in arrest affidavits. For example, Bolin said, Clearview AI results might be described simply as evidence obtained via “an investigatory tool."
Moers, the county’s top prosecutor, doesn’t know how the EPD cites evidence obtained through Clearview AI’s software and believes greater transparency could be beneficial to all parties.
“I think it's best practice to be very specific (in arrest affidavits)," Moers said. "I think it's always good for prosecutors and judges and police and defense attorneys to be up on what's being used, for sure."
Moers said it could be helpful to discuss with Bolin and Vanderburgh County Sheriff Noah Robinson how Clearview AI evidence is referred to in arrest affidavits and court records.
"Me, Noah and Billy, you know, maybe we can figure out the answer to this question," Moers said.
Nathan Wessler, the deputy director of the ACLU’s Speech, Privacy, and Technology Project, said it is critical for prosecutors and the legal community to be apprised of how police use tools like Clearview AI to investigate crimes.
"If the police department is concealing its use of this technology from criminal defendants and their attorneys, or from judges, they may be endangering prosecutions," Wessler told the Courier & Press. "There can be motions to throw out evidence because of violations of people's constitutional right to get disclosure of information about the investigation."
Barry Blackard, an Evansville attorney who practices criminal law with Blackard & Brinkmeyer, said any failure to disclose "all relevant evidence in a criminal case" could "undermine the integrity of our criminal justice system."
Wessler, Cate and other legal experts the Courier & Press consulted believe public officials, prosecutors and defense attorneys should work with police to set basic policies and ground rules governing how cutting-edge investigative tools are deployed, and how any evidence gathered is presented in court.
Here are the guardrails experts and advocates want to put in place
Stakeholders have proposed a range of regulatory steps officials at the federal, state and local levels could take to ensure police use of facial recognition technology avoids bias and legal liability.
Proposals range from outright bans on law enforcement use of facial recognition technology to requirements that detectives obtain judicially signed warrants before performing searches in programs such as Clearview AI.
Jonathan Barry-Blocker, a professor at the University of Florida’s Levin College of Law who specializes in U.S. civil rights and criminal procedure, said that while each state has its own set of legal standards, properly disclosing evidence obtained via programs like Clearview AI should be the standard for police.
"Generally, the spirit is that if something is used to prosecute you, or bring you into court, you as the accused should have the right to examine it or confront the sufficiency of that evidence," Barry-Blocker said. "There is a discrepancy in how our constitutional rights have been interpreted in the context of technological innovation, like facial recognition."
The Center for Democracy and Technology, a Washington, D.C.-based think tank focused on protecting civil liberties in an age of rapid innovation, laid out one potential path to regulating police use of facial recognition technology in a 2022 report.
The report, which noted that Clearview AI is used by more than 3,000 police departments (about one out of every six in the United States) argued for five key regulatory guardrails:
1. A requirement that police obtain a warrant to perform facial recognition searches.
2. A limitation on the types of cases police can investigate using facial recognition technology.
3. A ban on police using vague language to refer to facial recognition technology in court records, such as calling it an "investigative tool."
4. A ban on police using evidence obtained via facial recognition technology as the sole reason for arresting a person.
5. A ban on police using facial recognition to automatically scan the public’s faces at random, unconnected to a specific investigation.
Across the United States, a patchwork of state and local regulation governs how American police can use tools such as Clearview AI. Clay Calvert, a nonresident senior fellow at the American Enterprise Institute and a renowned expert on First Amendment law, said officials and regulators need to get up to speed on artificial intelligence-based tools.
“I think with facial recognition technology, there are some flaws built into it; the biases that it can't pick up people who are Black or darker skinned, or even women sometimes more easily than it can white people," Calvert told the Courier & Press. "So, the technology itself is not fully evolved. That raises questions about its regulation."
Ton-That, in an email to the Courier & Press, said Clearview AI's image-matching algorithm performs equally well across demographic, ethnic and racial groups, according to independent testing by the National Institute of Standards and Technology.
From Calvert’s perspective, a “unified approach” to regulating police use of facial recognition technology led by federal lawmakers would be the optimum outcome for businesses, the public and the police.
"Any regulation that we have now, or we adopt, may not take into account how that technology can change," Calvert concluded.