City Council legal adviser: Reject Detroit police use of facial recognition technology

George Hunter
The Detroit News

Detroit — The head of the City Council's legal team is recommending the council deny Detroit cops the ability to continue using facial recognition software, should the issue come to a vote.

David Whitaker, director of the council's legislative policy division staff, expressed in a Sept. 6 memo his concerns that police could abuse the software, and that white juries would be unable to render fair verdicts in trials with black defendants whose photos had been flagged by facial recognition technology.

Detroit Police Chief James Craig said the fears are unfounded, pointing to what he called a "rigorous process" that he says guards against abuses and wrongful identifications.

Craig said one of Whitaker's concerns in the memo — that police could scan faces in real time — is based on a provision in an early policy draft, which has since been removed.

The chief added: "I question (Whitaker's) research, since he never bothered to reach out to us to see how our system works. How can he say he has problems with our system, when he hasn't even seen it?"


Whitaker declined to comment. Councilman Roy McCalister, who in a July memo asked Whitaker to research the issue of police use of facial recognition technology, did not return phone calls.

Detroit police have been using the software for about a year and a half, guided by standard operating procedures. The department's request in June that the Detroit Board of Police Commissioners approve a permanent policy governing the software's use sparked an often-heated debate.

Critics say the technology opens the door to privacy violations, and point to the fact that many facial recognition systems falsely identify people with dark skin.

Craig said those concerns are mitigated by checks and balances built into the procedure, which requires two technicians and a supervisor to sign off on any potential matches the computer generates before they're forwarded to an investigator.

"Anyone who abuses the system will be held accountable, which will result in termination, and possible prosecution," Craig said.

Craig and Mayor Mike Duggan have insisted the technology will only be used after a violent crime has been committed — and not to scan people's faces in real time — but in his memo, Whitaker said he was skeptical.

"The Police Department's argument that its facial recognition technology is a 'mere investigative tool,' like the Mayor's argument that the Police Department under his supervision will not be abusing the technology, are ultimately unconvincing," Whitaker wrote.

He added: "The potential for other public and private agencies to abuse this complex and secretive technology, and its potential to generate false convictions, cannot be dismissed."

Craig pointed out that the City Council unanimously approved the $1 million software purchase in July 2017, and said there's been nothing secretive about the process.

"People keep saying this is a secret program, but there was a discussion about it in an open council meeting," Craig said. "The council members' concerns were aired out, and after a discussion they voted to approve it. How do you have an open meeting, and then vote to approve $1 million, and do it in secret? That's crazy."

Per the City Charter, the police board must approve all Detroit police policies. The board has discussed at recent meetings the pros and cons of police using facial recognition technology, although no vote has been scheduled.

Craig has tweaked the initial proposed policy he submitted in June, removing a provision that would have allowed police to use the technology to scan faces in real time if there was a credible terrorist threat. The chief said federal authorities would instead handle those situations.

If the board votes for the policy proposal, it would then go to the City Council for approval — which Whitaker recommended against, in part because of what he said is a racist criminal justice system.

"In a hypothetical heinous criminal case, where an African American defendant stands before an all-white tribunal ... the claim that facial recognition technological identification would be insufficient to support a conviction flies in the face of everything we know about race in the US justice system," Whitaker wrote.

Craig replied: "I don't even know what he means by that. I've said it over and over: We will not seek charges against someone based solely on a facial recognition hit. So nobody would ever get to court with facial recognition identification as the only evidence."

In his memo, Whitaker claims police already have opened the door to expanding the software's use. 

"Multiple recent written orders from the Chief of Police ... indicate that the possibilities of using this technology for surveillance and sharing data with other security entities has been considered permissible to a significant extent," Whitaker wrote.

Whitaker attached to his memo directives from Craig discussing real-time use of the software on traffic-mounted cameras in case of a credible terror threat.

"That part has been removed," Craig said. "So he's basing his criticism on an early draft of the policy." 

McCalister on July 3 sent a written request to Whitaker asking him to conduct a "comparative study" of police use of facial recognition software.

McCalister was not yet elected when the council voted in July 2017 to approve $1 million to purchase the software from DataWorks Plus of Greenville, South Carolina. The contract to provide the software and support expires next year.

Craig in recent weeks has invited community members, police commissioners and others to tour the Real Time Crime Center to see how the system works. He said he's gotten positive feedback from people who misunderstood the technology, including police board chairwoman Lisa Carter.

During a July board meeting, Carter said she opposed the software because of its potential to misidentify black people whose photos are fed into the computer, but after taking a tour of the crime center, she told The News her fears were allayed.

(313) 222-2134

Twitter: @GeorgeHunter_DN