By Karen Gullo
In a victory for transparency in police use of facial recognition, a New Jersey appellate court today ruled that state prosecutors—who charged a man with armed robbery after the technology showed he was a “possible match” for the suspect—must turn over to the defendant detailed information about the face scanning software used, including how it works, its source code, and its error rate.
Calling facial recognition “a novel and untested technology,” the court in State of New Jersey v. Francisco Arteaga held that the defendant would be deprived of due process rights unless he could access the raw materials police used to identify him and test the technology’s reliability to build a defense. The inner workings of the facial recognition software are vital to impeach witnesses’ identification of him, challenge the state’s investigation, and create reasonable doubt, the court said.
The ruling is a clear win for justice, fairness, and transparency. Study after study shows that facial recognition algorithms are not always reliable, and that error rates spike significantly for people of color, especially Black women, as well as trans and nonbinary people. Yet this heightened inaccuracy for members of vulnerable communities often targeted by the police hasn’t stopped law enforcement from widely adopting this unreliable tool to identify suspects in criminal investigations.
EFF, along with the Electronic Privacy Information Center (EPIC) and the National Association of Criminal Defense Lawyers (NACDL), filed an amicus brief in this case on behalf of the defendant, arguing that the court should allow robust discovery regarding law enforcement’s use of facial recognition technology.
The court agreed. The submissions of the defendant’s expert witness and amici, including EFF, documenting the substantial risk of error in facial recognition technology (FRT), “provide us with convincing evidence of FRT’s novelty, the human agency involved in generating images, and the fact FRT’s veracity has not been tested or found reliable on an evidential basis by any New Jersey court,” the three-judge appellate panel said.
In Arteaga, a facial recognition search conducted by the New York Police Department for New Jersey police was used to determine that Arteaga was a match for the perpetrator of an armed robbery at a New Jersey store.
Here’s how it worked. New Jersey detectives generated a still image of the suspect derived from surveillance camera footage. It was first analyzed by New Jersey investigators, who found no matches for the image in their face scan databases. The detectives then sent all surveillance footage to the facial recognition section of the New York Police Department Real Time Crime Center (RTCC). A center detective captured a still image from the footage, compared it against the center’s databases, and offered Arteaga as a “possible match.” New Jersey detectives showed his image to two witnesses, who identified him as the robber.
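The court’s opinion doesn’t reveal which vendor’s software the RTCC used, but most facial recognition searches follow the same general “probe versus gallery” pattern: a neural network converts each face image into an embedding vector, and the probe image is scored against every face in the database. The sketch below is purely illustrative, with random vectors standing in for the proprietary embeddings a real system would compute; the model that generates those embeddings is exactly the black box at issue in this case.

```python
# Hypothetical sketch of a "probe vs. gallery" face search.
# Random vectors stand in for the face embeddings a real
# (proprietary) neural network would produce.
import numpy as np

rng = np.random.default_rng(0)
EMBEDDING_DIM = 512  # a common size for face embeddings

# Gallery: e.g., a mugshot database, one embedding per enrolled face.
gallery = rng.normal(size=(10_000, EMBEDDING_DIM))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

# Probe: the still image captured from surveillance footage.
probe = rng.normal(size=EMBEDDING_DIM)
probe /= np.linalg.norm(probe)

# Cosine similarity of the probe against every gallery face.
scores = gallery @ probe

# The system surfaces the top-ranked candidates as "possible matches";
# the ranking logic and any cutoff are vendor-specific design choices.
TOP_K = 5
for rank, idx in enumerate(np.argsort(scores)[::-1][:TOP_K], start=1):
    print(f"rank {rank}: gallery id {idx}, similarity {scores[idx]:.3f}")
```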
Despite the centrality of the match to the case, nothing was disclosed to the defense about the algorithm that generated it, not even the name of the software used. Mr. Arteaga asked for detailed information about the search process, with an expert testifying to the necessity of that material, but the trial court denied those requests.
Police should not be allowed to use “black box” technology developed by private software makers in criminal cases without scrutiny. It is impossible to know exactly how the software’s algorithms reach their conclusions without looking at their source code. Each algorithm is developed by different designers and trained using different datasets.
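One concrete example of why the source code matters: the threshold at which a similarity score becomes a “possible match” is a hidden design choice, and moving it changes the system’s error rate. The numbers below are synthetic, not drawn from any real product, but they show how the same scores yield very different false match rates depending on where a vendor sets the cutoff.

```python
# Hypothetical illustration: the hidden match threshold drives the
# false match rate. Similarity scores here are synthetic stand-ins
# for scores between pairs of *different* people ("impostor" pairs).
import numpy as np

rng = np.random.default_rng(1)
impostor_scores = rng.normal(loc=0.30, scale=0.12, size=100_000)

for threshold in (0.45, 0.55, 0.65):
    false_match_rate = np.mean(impostor_scores >= threshold)
    print(f"threshold {threshold:.2f} -> false match rate {false_match_rate:.4%}")
```

In a real system, both the score distribution and the threshold depend on the algorithm and its training data, which is why each vendor’s software must be examined on its own terms.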
Defendants must be allowed to examine FRT in its entirety, including a description of the entire process, the databases of faces used, the source code of the software, and where human input is incorporated. Though human judgment is often argued to cure algorithmic errors, in reality humans are likely to make the same kinds of errors, compounding bias and inaccuracy for the same demographic groups. It should come as no surprise, then, that facial recognition searches routinely result in wrongful arrests.
The court in Arteaga appreciated this, and ruled that under the “Brady rule,” which requires the government to provide all material evidence that might exculpate the defendant, Mr. Arteaga was entitled to information regarding the FRT used. The court soundly rejected the state’s argument that it was speculative whether the FRT information would be exculpatory, reminding the prosecution that “[g]iven FRT’s novelty, no one, including us, can reasonably conclude without the discovery whether the evidence is exculpatory.”
Facial recognition is being used around the country to identify suspects, and we hope other courts recognize that the constitutionally protected right of due process demands that defendants be allowed to examine and question the reliability of this often faulty technology.
Source: EFF
Karen Gullo is an award-winning former journalist working as an analyst and senior media relations specialist at EFF, collaborating with the organization’s lawyers, activists, and technologists on strategic communications and messaging to amplify their work defending civil liberties in the digital world. As a writer, editor, and former reporter with over two decades of experience at Bloomberg News and the Associated Press in San Francisco, Washington D.C., and New York, Karen helps develop EFF’s responses to media inquiries, and writes press statements, releases, and op-eds about EFF’s advocacy of online privacy and free speech, encryption, Fourth Amendment rights, copyright abuse, and much more.

As an analyst, Karen writes blog posts and contributes to white papers on subjects ranging from student privacy and mass surveillance to private censorship, the First Amendment, and international surveillance and data protection treaties. She has worked on EFF activism projects holding social media platforms accountable for bad content moderation practices, exposing Amazon Ring’s cozy relationships with local law enforcement, and pushing for the inclusion of human rights safeguards in the Council of Europe’s revised Budapest Convention. She is also a contributing writer for the feminism website SeismicSisters.com.

Prior to joining EFF, Karen was a reporter at Bloomberg News from 2002 to 2015, where she broke stories about Google’s legal challenge to FBI national security letters. Before Bloomberg, Karen was a reporter for the Associated Press in New York and Washington, covering politics—including the 2000 presidential election—the Justice Department, campaign finance, federal contracting practices, and much more as a member of an investigative reporting team. Karen is the recipient of national and local journalism awards, including the Jesse H. Neal Award for business journalism and the San Francisco Peninsula Press Club’s excellence in journalism awards. She grew up in Oak Park, Illinois, and resides in San Francisco.