Michigan man wrongfully accused by facial recognition urges Congress to act

Melissa Nann Burke
The Detroit News

Washington — Robert Williams said when police first called and told him to turn himself in, he assumed it was a prank call. He hung up and phoned his wife to reassure her that everything was OK. 

But it wasn't. When he pulled into his Farmington Hills driveway after work, a Detroit police car blocked him in, "as if I was going to make a run for it," Williams recalled Tuesday to lawmakers. 

Williams was arrested in front of his wife and daughters, then ages 2 and 5, that day in January 2020. The young girls cried.

When detectives came to question him in jail, they showed Williams a surveillance photo of a suspect. It wasn't him. He had been mistakenly tagged by facial recognition software as a suspected shoplifter at Midtown's Shinola store, where five watches were stolen in 2018.

"I held that piece of paper up to my face and said, 'I hope you don't think all Black people look alike,'" Williams said.

He recalled the officer told him, "I guess the computer got it wrong."

"I’ve been fighting ever since," Williams said. "I hope that Congress does something about this."

Williams, a 43-year-old Detroit native, testified Tuesday before the U.S. House Judiciary Subcommittee on Crime, Terrorism and Homeland Security, which is examining law enforcement’s use of facial recognition technology, including the "risk to civil liberties and due process posed by this technology."

He found a sympathetic audience. Texas Rep. Sheila Jackson Lee, chairwoman of the subcommittee, said Williams' story was "stunning" and told him she was "very apologetic to you and your family for having to go through this." 

"Mr. Williams' story should inspire all of us, among others, to find a bipartisan pathway forward," Jackson Lee said. 

Williams, a logistics planner in the automotive industry, was in police custody for 30 hours. He was arraigned on a first-degree theft charge that was later dropped by the Wayne County Prosecutor's Office for insufficient evidence.

Williams' case is often cited by critics of facial recognition technology, who point to studies suggesting the software systems return a disproportionate number of false matches for Black people. Williams is African American.

Lawmakers from both parties at Tuesday's hearing raised similar concerns about the proliferation of the technology and about algorithms that are less accurate when analyzing the faces of women, people of color, the elderly and children.

"The technology here is absolutely problematic and inconsistent," said Arizona Rep. Andy Biggs, the panel's top Republican, citing "enormous concerns." 

Several experts testifying Tuesday acknowledged legitimate uses of the technology by law enforcement but said the landscape currently is like the "Wild West" with no regulation by the government. In some cases, they said, police and other law enforcement officers using the software receive little or no training.

"When there's no regulation, we're going to have mistakes, and we're going to have (data) breaches and we're going to have all kinds of trouble," said Barry Friedman, a law professor and director of the Policing Project at New York University School of Law.

Friedman said Congress could start by regulating the software vendors directly as a product moving through interstate commerce, setting standards that require the companies themselves to identify what constitutes an effective probe image, the appropriate size of the databases used, and thresholds to avoid false positives and false negatives under different circumstances.

He also said the technology should be used by law enforcement only in cases of serious crime.

Jackson Lee noted that at least 18,000 police departments across the country are using facial recognition software. 

While she applauded the use of the technology to apprehend domestic terrorists, including several of those charged in the Jan. 6 attack on the U.S. Capitol, she said use of the tool must be weighed against privacy, oversight and transparency concerns and its potentially discriminatory impact. 

"As these trends have developed, the federal government has been largely absent," she said. "What we do not know towers over what we do, and that needs to change."

A report by the U.S. Government Accountability Office last month found 20 of the 42 federal law enforcement agencies it surveyed used some form of facial recognition technology, including 14 agencies that said they use systems from outside the federal government.

Thirteen of those 14 agencies reported not having a mechanism to track what non-federal systems are used by employees, Gretta L. Goodwin, a GAO director, told lawmakers.

Several lawmakers seemed troubled by her disclosure that the agencies had to poll their employees to answer the GAO's questions about whether and how they were using the technology, suggesting the agencies had not evaluated associated risks such as accuracy and privacy.

"Although the accuracy of facial recognition technology has increased dramatically in recent years, risks still exist that searches will provide inaccurate results," Goodwin said.

"For example, if a system is not sufficiently accurate, it could unnecessarily identify innocent people as investigative leads. The system could also miss investigative leads that could otherwise have been revealed."

Williams in April filed a federal lawsuit seeking damages from the city of Detroit, its police chief and a police detective for "the grave harm caused by the misuse of, and reliance upon, facial recognition technology."

The American Civil Liberties Union, which represents Williams, has said that his was "the first case of wrongful arrest due to facial recognition technology to come to light in the United States."


Detroit's mayor and former police chief have acknowledged that Williams' arrest was a mistake and say that new protocols are in place to avoid similar incidents.

"What you need to do is make sure you have the right protocols, and since September there are a whole series of protocols in place that this incident would not have been possible," Mayor Mike Duggan said earlier this year. 

"But it’s unfortunate that it’s happened before Detroit had its own facial recognition, before it had its own policy, and there was really no excuse for Mr. Williams having been arrested." 

Former Detroit Police Chief James Craig has blamed poor detective work for Williams' arrest, not the facial recognition system. He personally apologized for the incident.

“It had nothing to do with technology, but certainly had everything to do with poor investigative work,” Craig has said, adding that the "sloppy work and lack of management oversight" prompted him to demote the detective's supervisor, a captain, to the rank of lieutenant.

City officials have said they would expunge Williams' record and remove his personal information from a police database.

Asked about the effect on him and his family, Williams said he's considered taking his daughter to a psychiatrist because she gets very emotional when the video of his arrest is replayed on the news. 

"She can't stand to watch it," he told Jackson Lee. "I had to go talk to a 5-year-old, at the time, about a mistake that had been made by the police. It was hard on her to understand that sometimes they make mistakes — because you tell her as a child that the police are your friend."

Williams later told Rep. Madeleine Dean, D-Pennsylvania, that his speech impediment stems from a series of strokes he suffered last year that one doctor attributed to heightened stress.

The case stems from the theft of five watches worth about $4,000 from a Shinola store in the Cass Corridor. A security officer reviewed video footage that showed the suspect wearing a St. Louis Cardinals baseball cap, but the man did not look into the camera, according to the lawsuit.

Michigan State Police ran the surveillance image through facial recognition software, which returned a hit for Williams. Detroit investigators then showed six photographs — including Williams' driver's license photo from six years earlier — to the security officer, who had not witnessed the incident in person, according to a complaint against the police.

"None of us looked alike. And I was like, who put this together?" Williams told lawmakers. "I look nothing like the other guys. I was actually, like, 10 years older and all the other guys in the pictures."

Williams said he doesn't feel like he's ever received a proper explanation for what happened that led to his wrongful arrest.

"We asked them over and over and over again for a proper explanation," Williams told Dean. "They just didn't seem like they were sorry. They didn't want to be apologetic about it. They were like, it happens; we made a mistake. I'm like, that doesn't do anything for me."

Amid the controversy, the Detroit City Council last fall approved a nearly $200,000 facial recognition technology contract with South Carolina-based DataWorks Plus, which funds software maintenance and equipment support. The contract expires Sept. 30, 2022.

At Tuesday's hearing, an expert recommended a policy that would prohibit using facial recognition technology to launch an investigation, saying police should have an independent basis to do so.

“Ultimately, if they went to trial, you could utilize facial recognition technology as additional evidence,” said Brett Tolman, executive director of the conservative group Right on Crime.

“But you can’t have it the other way around, because then you have concerns about whether that evidence that is corroborating the facial recognition technology is simply influenced by the subjectivity of the investigators.”

Another witness at Tuesday's hearing worried that communities would continue to see situations such as Williams' because of the "inability to train people appropriately" in policing. 

"There has to be best practices, and there needs to be certification both, I believe, certainly at a state level and a federal level, and that has to be ongoing," said Cedric L. Alexander, former public safety director for Dekalb County, Georgia.

"Because one thing we know currently is that we don't train enough. But when you are utilizing that type of technology that can easily infringe upon someone's Fourth Amendment rights, I think we have to be very careful."

mburke@detroitnews.com

Staff writers George Hunter and Sarah Rahal contributed.