Detroit council expected to vote on facial recognition software contract
Detroit — City Council is expected to vote next week on a contract for controversial facial recognition software that opponents argue is "racist" and flawed.
The measure — to cover costs associated with upgrades and maintenance — is set to return to the council table after members delayed action in June, citing a need for Detroit police to engage the community on the technology.
Residents and activists urged council members Monday to reject the software support agreement and do away with facial recognition.
During a Monday meeting of the council's Public Health and Safety Committee, social justice advocate Tawana Petty argued facial recognition is a means of "social control" that further marginalizes an already marginalized community.
"Detroit should not be doubling down against Black residents," Petty said. "I pray you get on the right side of history."
Facial recognition and Green Light surveillance camera technology have been contentious crime-fighting tools in Detroit, drawing criticism from groups such as Detroit Will Breathe, a coalition that assembled in the wake of the Memorial Day killing of George Floyd to march against police brutality.
Claire Bowman spoke Monday on behalf of the group, reiterating its demand that the "racist technology" be discontinued.
The nearly $200,000 contract with South Carolina-based DataWorks Plus would fund software maintenance and support for the department's facial recognition equipment. If approved, the contract would run from Oct. 1, 2020, through Sept. 30, 2022.
Detroit's City Council first approved a two-year, $1 million contract for facial recognition software in 2017. The council committee on Monday sent the measure to the full council for a vote on Sept. 29, with a recommendation to approve it.
Detroit Police Capt. Aric Tosqui told council members Monday the department convened multiple community engagement sessions on facial recognition prior to the COVID-19 outbreak and has continued hosting them via Zoom in recent months.
Tosqui noted the department owns the software in perpetuity and is continuing to use it to help solve crimes. But police need the council's approval on the contract to ensure it's properly updated and maintained.
"When folks from the community are saying that facial recognition is adding to an increase in surveillance of the community, that's just not the case," he told the committee. "Facial recognition is there to help solve crimes that would have been whodunits without the use of the software."
The software, he said, is used for face matching with a digital book of mugshots. Once a potential hit is identified, it's then up to detectives to investigate further.
Detroit Police Chief James Craig has said the department has always been willing to engage the community and began doing that about a year ago.
"We invited community members and elected officials into the Real Time Crime Center to see how the technology works," he told The Detroit News last month. "The Real Time Crime Center is under construction now, so we’re limited as far as visits go, but that doesn’t mean we still won’t have an open discussion with anyone who wants to talk about it.”
A December review of the industry’s leading facial recognition algorithms by the National Institute of Standards and Technology found they were more than 99% accurate when matching high-quality headshots to a database of other frontal poses.
But trying to identify a face from a video feed, especially using ceiling-mounted cameras commonly found in stores, can cause accuracy rates to plunge. Studies also have shown that face recognition systems don’t perform equally across race, gender and age — working best on white men, with potentially harmful consequences for others.
In one high-profile case, Robert Williams, who is Black, said he was mistakenly tagged by facial recognition as a suspected shoplifter in Detroit in 2018.
The American Civil Liberties Union filed a complaint with the Detroit Board of Police Commissioners seeking a public apology from police, the permanent dismissal of Williams' case and the removal of Williams' information from criminal databases.
Tosqui said critics of the software are pointing to issues that arose in the past, and that the department now has a "strict policy" restricting its use to the most egregious violent crimes and home invasions.
Police officials revised the policy governing use of the software last year, removing a contentious provision that allowed it to be used to scan faces in real-time if there's a terror threat. The revisions also laid out punishment for officers who abuse the system.
The rules were adopted last fall, after the software had been in use by the department for a year and a half.
"We're not going out and using it for misdemeanors or surveilling other people," he said.
Detectives, he added, have undergone extensive training "so they know that this is just a lead only."
"If we get a match, that detective still has to go out there and corroborate that," he said. "They have to do extreme due diligence to figure out if that lead is the right lead."
Councilwoman Janee Ayers said Monday she understands the dueling perspectives on facial recognition.
"But I also know that we have an obligation to do what's right for the masses," she said during the committee session. "We have a city of 700,000 people, and everybody deserves to feel safe and have every tool in the toolbox at their disposal to get whatever resolution they need."
Without the contract to support software upgrades, the technology would become antiquated and less reliable, police department officials said Monday.
The technology has been used 105 times this calendar year, yielding 62 matches, police said.
Ben VanderSloot, an assistant professor of computer science at the University of Detroit Mercy, nonetheless warned the council of the dangers that come with the software's use.
"Don't believe facial recognition is automatically going to save anything," he said Monday. "It's causing more problems than it's worth."