Tlaib: Use only blacks as Detroit’s facial recognition analysts
The Congresswoman, concerned about facial recognition software, made the remark during a tour of Detroit's Real Time Crime Center.
The Detroit News
U.S. Rep. Rashida Tlaib told Detroit police Chief James Craig he should employ only black people on the department’s facial recognition team because “non-African Americans think African-Americans all look the same.”
The Detroit Democrat made the statement during a tour of the Real Time Crime Center, where monitors display live footage from video cameras on traffic lights and in and around businesses.
A day after Tlaib made the comment Monday, her spokesman said she was trying to convey the importance of accurately identifying black suspects in a city with an African American population of about 80%.
Police officials invited Tlaib to the facility inside Public Safety Headquarters to see how Detroit uses facial recognition software, after she criticized the technology in an Aug. 20 tweet. The congresswoman wrote: "@detroitpolice You should probably rethink this whole facial recognition bull----."
The tour, which lasted more than an hour, was often tense, with Tlaib and Craig wrangling over how the department uses the software, privacy issues, and concerns that the technology misidentifies a disproportionate number of darker-skinned people. A major point of contention: whether only black civilians should work in the crime center analyzing photos flagged by the software.
“Analysts need to be African Americans, not people that are not,” Tlaib told Craig. “I think non-African Americans think African Americans all look the same.
“I’ve seen it even on the House floor: People calling Elijah Cummings 'John Lewis,' and John Lewis 'Elijah Cummings,' and they’re totally different people,” Tlaib said, referring to the two longtime Democratic congressmen. “I see it all the time, and I love them because they go along with it.”
Craig replied: “I trust people who are trained, regardless of race; regardless of gender. It’s about the training.”
“I know,” Tlaib answered. “But it does make a huge difference with the analysts.”
After the tour, when a reporter asked whether she meant white people weren't qualified to work in the crime center, Tlaib said: “No, I think there has actually been studies out that it’s hard for — African Americans would identify African Americans, or Latinos, same thing.”
Tlaib was then asked whether that meant non-whites should be barred from working as crime analysts in mostly white communities. She replied: “Look it up.”
On Tuesday, Tlaib’s spokesman Denzel McCampbell provided a link to a study by psychology professors at Memorial University of Newfoundland suggesting that people using facial recognition technology more accurately identify members of their own race.
"The studies (Tlaib mentioned are) related to cross-race effect or other-race effect," McCampbell said in an email. "This has shown that individuals are less accurate when identifying people from a race other than their own.
"Detroit has a black population of more than 80%, so that is where her basis came from and what she was trying to convey when it comes to accurate identification," McCampbell said.
Researchers have studied the "cross-race effect" for years.
In a 2008 report about eyewitness testimony, researchers Sheree Josephson of Weber State University and Michael E. Holmes of Ball State University studied 40 people of different racial backgrounds who watched video of a crime in progress. In the next 24 hours, they were asked to pick the suspect out of a photo lineup.
Most of the participants misidentified the suspect, or said they didn’t see him in the lineup. Correct identifications happened more often when the suspect and witness shared the same race, the study found.
Craig said several members of his staff, black and white, told him after Monday's meeting they were outraged by Tlaib’s remarks.
“It’s insulting,” Craig said. “We have a diverse group of crime analysts, and what she said — that non-whites should not work in that capacity because they think all black people look alike — is a slap in the face to all the men and women in the crime center.”
Craig said all officers and civilian employees go through mandatory implicit bias training.
“That’s something we train for, and it’s valuable training, but to say people should be barred from working somewhere because of their skin color? That’s racist.”
The Detroit City Council in July 2017 unanimously approved $1 million to purchase the facial recognition software from DataWorks Plus of Greenville, South Carolina. The contract to provide the software and support expires next year.
The Detroit Board of Police Commissioners approved a policy governing the software Sept. 19 after months of contentious debate, with critics arguing that the technology can misidentify people with darker skin.
During Monday's tour of the crime center, Tlaib repeatedly expressed concern that the technology flags an inordinate number of darker-skinned people and raised privacy fears, saying she was "taken aback" during recent congressional meetings on the subject.
"Even my Republican colleagues were very concerned about facial recognition, because all this information in databases offers no protections for our residents," she said.
"We don’t disagree with the concerns," Craig said. "If you'd be patient with me and let me go through this."
Craig showed a photo of an African-American woman whose mugshot was mistakenly flagged after a photo of a black male shooting suspect was fed into the computer.
The chief started to explain the next steps in the process, but Tlaib cut in: “We know it’s close to a 60% error rate because it doesn’t identify black people; you know that, Chief. Chief, the error rate among African Americans, especially women, 60%."
"I understand the technology," Craig said. "That's why I’m taking you through it personally."
"I know," Tlaib said. "Just see if you can get some of our money back before we fix it."
“No,” Craig replied.
Tlaib then asked whether a facial identification hit is ever the sole evidence used to bring charges against someone, and Craig told her no.
“A match is a tool only,” Craig said.
Craig went on to tell Tlaib that after the software misidentified the woman, crime analysis supervisor Andrew Rutebuka saw it wasn’t a match and moved on to other photos.
Rutebuka then put an old police mugshot on the screen, which he said did match the suspect, 21-year-old Davevion Dawson, who is awaiting trial on felonious assault and weapons charges.
“That’s him,” Rutebuka said.
Tlaib fired back: “How do you know? You can’t say it’s him; it’s allegedly him. That’s the lawyer talk in me. It’s his life we’re talking about.”
"Let me stop you right there," Craig interjected. "It's his life, but guess who else’s life? The victim’s. We never talk about the victims. What about that victim's rights? What about the family of the victim? What about their justice?"
"Do you guys have witnesses as well?" Tlaib asked.
"Yes," Craig said.
"The warrant wasn't issued solely based on this, was it?" Tlaib asked.
Craig sighed. "No, it wasn’t," he said. "Work with me. Work with me."
After the presentation, Tlaib was asked if any of her concerns were allayed. "No, I mean … we’ve got to make sure that it’s not expanded … on public housing … these are where people live, and it would be very dehumanizing to have surveillance at your own … home," she said.
Tlaib in July joined Democratic U.S. Reps. Yvette D. Clarke and Ayanna Pressley in filing the “No Biometric Barriers Housing Act of 2019,” which would prohibit the use of facial recognition technology in most federally funded public housing.
After Monday's tour, police board member Annie Holt, who was among those present, said she thought the presentation would change Tlaib's mind.
"I really appreciate the energy and the commitment of Congresswoman Rashida Tlaib, and I'm expecting that the impression she received today will help her revisit some of the notions she previously had about the technology," Holt said.
When asked whether she agreed with Tlaib's comments about whether only black people should work as crime analysts in Detroit, Holt said: "I'm not going to speak to that. I didn't hear it, and it serves no purpose in terms of what we're trying to accomplish as a city and as a nation."
Michigan State Police Lt. Michael Shaw said there are many misconceptions about facial recognition software, including the mistaken belief that it misidentifies people based on skin color.
"The software doesn’t work on gender or race; it works on facial measurements," Shaw said. "If you have a good photo, the photo array will come back with people with the same facial makeup, whether they're male, female, black, white or whatever. It's all about things like the spacing between the eyes, or where the ears are on the side of your head.
"That's where the human element comes in," said Shaw, whose agency has used facial recognition technology for more than 17 years. "If you know you're looking for a white male, and the system kicks out a white female, or a black male, or whatever, then a technician will flag that. Nobody uses the software on its own."
Shaw said state police train constantly to guard against bias.
"There's implicit bias in everything we do," he said. "You just heard it in (Tlaib's) comments."