Michigan man misidentified by facial recognition technology warns Congress of risks posed by police surveillance


A Michigan man who was wrongly arrested after being misidentified by facial recognition technology joined expert witnesses who warned a congressional subcommittee about the pitfalls of police surveillance software used to find suspects.

Facial recognition technology used by the Detroit Police Department mistakenly matched Farmington Hills resident Robert Williams to a surveillance photo of a suspected shoplifter. Williams, who testified before a House panel Tuesday about his experience, was detained by police for 30 hours and arraigned before his charges were dropped due to insufficient evidence.

Williams said being arrested in front of his family left lingering trauma for his young daughters. Once he was detained, he said, detectives showed him photos of another man who bore no resemblance to him.

“I held that piece of paper up to my face and said ‘I hope you don’t think all Black people look alike,’” Williams said.

Civil rights advocates say Williams is the first documented example of a person wrongfully arrested based on a false identification by facial recognition technology. The case raises concerns about how flawed databases and disproportionate error rates in identifying Black and brown people can lead to wrongful arrests and invasions of privacy.

Using surveillance technology to identify suspects drew bipartisan criticism from members of the U.S. House Subcommittee on Crime, Terrorism, and Homeland Security. Representatives expressed concerns about the expansion of a “surveillance state” and drew comparisons to how the technology has been misused by the Chinese government to monitor its citizens.

“There are many unknowns, but we can be certain of one thing: Most, if not all, facial recognition systems are less accurate for people of color and women,” said U.S. Rep. Sheila Jackson Lee, D-Texas. “For the most part, we can be confident that the darker your skin tone, the higher the error rate.”

Jackson Lee said little thought has been given to the consequences of expanding the use of surveillance technology across the country. Members of Congress said the technology can be a valuable tool in solving crimes but also presents issues that need further study and government regulation.

Greta Goodwin, director of homeland security and justice at the Government Accountability Office, highlighted a recent report that found 20 federal agencies are using facial recognition technology. Thirteen of those agencies don’t know what systems their employees are using or how often they use them, she said.

Facial recognition software was used to monitor protesters at demonstrations against police brutality and at protests related to the 2020 election in Washington, D.C., according to the subcommittee. Surveillance technology was also used by a Georgia city to monitor compliance with COVID-19 regulations.

Bertram Lee Jr., counsel for the Leadership Conference on Civil and Human Rights, said 133 million American adults are included in facial recognition networks. Safeguards to ensure the technology is used responsibly are largely nonexistent, he said.

“Facial recognition technology dangerously expands the scope and power of law enforcement when combined with existing networks of surveillance cameras dotting our urban and suburban landscapes,” Lee said. “Facial recognition algorithms could enable governments to track the public movements, habits and associations of all people at all times, merely with the push of a button.”

The debate over facial recognition software in Michigan has largely centered on Detroit’s Project Green Light, a surveillance program that began in 2016. Civil rights groups argue the software is less accurate when trying to identify people with darker skin, creating concerns about mistaken arrests in the majority Black city.

U.S. Rep. Rashida Tlaib, D-Detroit, argued facial recognition software is “racist technology” after the Detroit City Council voted 6-3 to renew its facial recognition technology contract last year.

A Detroit Police Department report on the technology's use in 2021 found facial recognition software generated 62 leads in criminal investigations. All but one of the leads identified Black people, and only 37 were considered a “possible match.”

Only 27 of the leads were generated from images created by Project Green Light cameras, with 11 more coming from other security cameras. Twenty-four leads were generated from social media images.

Williams is suing the Detroit Police Department in federal court. The lawsuit argues his Fourth Amendment rights were violated and his wrongful arrest is in violation of the Michigan Elliott-Larsen Civil Rights Act. It seeks damages and policy changes to stop the use of facial recognition technology by Detroit police.

“What if the crime was capital murder and they just came to my house and arrested me?” Williams said. “I don’t know if I would have got a bond to get out and be free … I’d still be locked up for something I didn’t do.”

READ MORE ON MLIVE:

Michigan man sues police for wrongful arrest based on facial recognition technology

Ann Arbor considering ban on police use of facial-recognition technology

Banning use of facial recognition technology among Michigan lawmakers’ proposed police reforms

‘Shoddy’ work with facial technology led to Black man’s erroneous arrest


