A Black man was falsely arrested and jailed in April for allegedly flashing a woman in New York City after facial recognition technology misidentified him, even though he did not match the victim’s physical description of the suspect.
His lawyers are now calling for an investigation into the use of the technology by New York police, while civil rights advocates are demanding an outright ban.
Trevis Williams, 36, who is 6 feet 2 inches tall and weighs 230 pounds, spent more than two days in jail after a facial recognition program used by New York Police Department investigators selected his image from an array of mug shots in a database. A woman who had told police in February that an Amazon deliveryman had exposed himself to her in a Manhattan apartment building then reviewed that image alongside five other photos culled by police and identified Williams as the flasher, The New York Times first reported.

It didn’t matter to police that the woman had described the suspect as 5 feet 6 inches tall and weighing about 160 pounds. Or that Williams, whose mug shot was in the system because of a misdemeanor arrest over a fighting incident, told police that on the day of the crime he was driving back from his job in Connecticut, where he worked with autistic adults, to Brooklyn to meet a friend, and could prove it with location data from his phone.
Both Williams and the man in the surveillance image were Black, had thick beards and mustaches, and wore their hair in braids.
During an April 21 interrogation, Williams, who had reportedly been picked up after police stopped him on suspicion of fare evasion in the subway, told investigators he had started delivering packages for Amazon on April 1, and that sealed his fate. Despite his insistence that he had been working in Connecticut in February, police charged him with indecent exposure the following day.
“Traditional police work could have solved this case or at least saved Mr. Williams from going through this,” said Diane Ackerman, Williams’ lawyer and a staff attorney in the Digital Forensics Unit of the Legal Aid Society in New York.
She said the NYPD failed to investigate beyond the photo identification, which began when investigators fed a blurry still from surveillance video at the crime scene into a system that uses algorithms to convert the contours of a face into data points and then searches a database for statistically similar faces, the Times explained.
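For readers unfamiliar with how such a search generally works, the sketch below illustrates the basic idea in Python: each face is reduced to a fixed-length vector of data points (an "embedding"), and mug shots are ranked by their statistical similarity to the probe image. The embedding function, names, and data here are illustrative placeholders, not the NYPD’s actual software or vendor system.

```python
# Minimal sketch of embedding-based face matching, under assumed placeholder data.
# The ranking logic is the point being illustrated; the "model" is a stand-in.
import numpy as np

EMBEDDING_DIM = 128  # real systems commonly use vectors of 128-512 dimensions


def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained face-embedding model.

    A real system would detect and align the face, then run it through a
    neural network; here we just hash the pixels into a fixed-length unit
    vector so the example runs end to end.
    """
    rng = np.random.default_rng(abs(hash(image.tobytes())) % (2**32))
    vec = rng.standard_normal(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)


def rank_candidates(probe: np.ndarray, gallery: dict, top_k: int = 6):
    """Return the top_k gallery entries most similar to the probe image."""
    probe_vec = embed_face(probe)
    scores = {
        name: float(np.dot(probe_vec, embed_face(img)))  # cosine similarity of unit vectors
        for name, img in gallery.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]


if __name__ == "__main__":
    # Fake 64x64 grayscale "images" standing in for a blurry surveillance still
    # and a gallery of mug shots.
    rng = np.random.default_rng(0)
    surveillance_still = rng.integers(0, 256, (64, 64), dtype=np.uint8)
    mug_shots = {f"mugshot_{i}": rng.integers(0, 256, (64, 64), dtype=np.uint8) for i in range(100)}
    for name, score in rank_candidates(surveillance_still, mug_shots):
        print(f"{name}: similarity {score:.3f}")  # a high score is a lead, not proof of identity
```

The key design point the sketch makes is that the output is only a ranked list of statistically similar faces; nothing in the math itself confirms identity, which is why a low-quality probe image can surface a lookalike rather than the actual suspect.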
A forensic analysis of Williams’ phone records commissioned by his lawyers showed that at the time of the crime, his phone was communicating with cell towers around Brooklyn, 12 miles away from the crime scene, according to data reviewed by the Times.
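The location reasoning behind that kind of analysis can be illustrated with a short, hypothetical calculation: given the coordinates of a cell tower a phone connected to and of the crime scene, the great-circle (haversine) distance shows how far apart they are. The coordinates below are arbitrary examples, not the actual towers or address from this case.

```python
# Hedged illustration only: haversine distance between a hypothetical cell tower
# and a hypothetical crime-scene location. All coordinates are made up.
import math


def haversine_miles(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in miles between two latitude/longitude points."""
    r_miles = 3958.8  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r_miles * math.asin(math.sqrt(a))


if __name__ == "__main__":
    tower_brooklyn = (40.650, -73.950)        # hypothetical tower location
    crime_scene_manhattan = (40.800, -73.960)  # hypothetical scene location
    d = haversine_miles(*tower_brooklyn, *crime_scene_manhattan)
    print(f"Tower-to-scene distance: {d:.1f} miles")  # roughly 10+ miles apart
```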
The police did not contact Amazon to find out the identity of the deliveryman at the woman’s apartment. An Amazon spokesperson said the company would have cooperated.
Prosecutors dismissed the case against Williams in July, ABC7 reported.
His is one of multiple false arrests based on faulty or improperly used facial recognition technology, a pattern that has prompted the Legal Aid Society to demand an investigation into how the NYPD uses the technology.
In a letter sent to the department’s Office of the Inspector General on Aug. 25, Legal Aid lawyers claimed that New York police officers are violating department policies by comparing photos of criminal suspects to photos outside of the NYPD photo repository, which contains arrest and parole photos of people charged with crimes in New York.
Police have improperly tapped into photo repositories and AI software managed and used by the New York Fire Department and the New York/New Jersey High Intensity Drug Trafficking Area program, which draw on images from all over the internet. Such searches have resulted in the illegal arrest of a pro-Palestinian student protester and the false arrest for carjacking of a woman who was eight months pregnant, the lawyers said.
Legal Aid argued that the NYPD’s circumvention of its facial recognition policies is “particularly unnerving given the rising number of cases in which clients, like Mr. Williams, were falsely arrested based on faulty facial recognition matches. … We are gravely concerned that the cases we have identified are only the tip of the iceberg, given how devastating the consequences of this technology can be, particularly for Black individuals.”
The Surveillance Technology Oversight Project (STOP), a privacy and civil rights group based in New York, last week condemned the false arrest of Williams and renewed its call for a ban on facial recognition in New York City and the state.
“Facial recognition doesn’t just threaten New Yorkers’ civil rights, it undermines their safety,” said STOP Executive Director Albert Fox Cahn in a statement. “While we know about this one case today, we have no idea how many of the thousands of other New Yorkers arrested with this technology were wrongly accused. If the NYPD is willing to use a facial recognition result this flawed, how can they ever be trusted to police their own algorithm?”
Researchers at the National Institute of Standards and Technology have found that facial recognition technology identifies the correct person a vast majority of the time, the Times reported. But that research typically involved images taken under controlled conditions, not the grainy, blurry images drawn from surveillance footage, which is often what law enforcement investigators are working with.
Facial recognition software “misidentifies individuals in poor quality photos and disproportionately misidentifies people of color, women, the young and the elderly,” STOP said in its 2021 report documenting the NYPD’s use of the technology, ABC7 noted.
“Everyone, including the NYPD, knows that facial recognition technology is unreliable,” said Ackerman. “Yet the NYPD disregards even its own protocols, which are meant to protect New Yorkers from the very real risk of false arrest and imprisonment. It’s clear they cannot be trusted with this technology, and elected officials must act now to ban its use by law enforcement.”
The NYPD said in a statement that its use of facial recognition has proven successful in thousands of cases since 2011, leading to matches in cases including rapes and murders. “Even if there is a possible match, the NYPD cannot and will never make an arrest solely using facial recognition technology,” it said.
Williams is still feeling the fallout from being misidentified by facial recognition technology.
“In the blink of an eye, your whole life could change,” said Williams, who told the Times he has a 12-year-old son and worried that the charges could get him placed on a sex offender list, and that he would be unable to find work or to pick up his child from school. Feeling humiliated and angry, he worried that everyone he knew would find out about the shameful charges.
He said he’s still living in fear due to the ordeal.
“It’s very stressful. It’s always on my mind,” he told CBS News, adding that he is now looking into taking legal action.
Williams said that he was in the process of becoming a correctional officer at Rikers Island, and since his arrest, “they kind of froze the hiring process,” ABC7 reported.
“I hope people don’t have to sit in jail or prison for things that they didn’t do,” Williams said.