Detroit cops revise proposed facial recognition policy, remove real-time provision
Detroit — Police officials have revised the proposed policy governing the department's use of facial recognition software, removing one of its most contentious provisions: the ability to use the technology to scan faces in real time if there is a terror threat.
The revised policy also lays out punishment for officers who may abuse the system, police chief James Craig said.
Craig said he pared the proposed policy down from 10 pages to two and earlier this week submitted the revisions to the Detroit Board of Police Commissioners for review, although the board was not expected to vote Thursday on whether to approve the policy.
"We have never used this for live-streaming, but we originally had it put in the policy that we would use it that way in case of a credible threat of a terrorist attack," Craig said. "But we removed that provision; now we'll only use it with still images.
"If there is a credible terror threat, the feds would lead the investigation anyway, so now it'll be up to them how they want to identify suspects," Craig said.
"Another area we tightened up: the police commission wanted to know specifically what would be the response if someone were to violate the policy," Craig said. "The new policy calls for dismissal and possible criminal charges if anyone abuses it."
The Detroit News reported in 2017 that Detroit police planned to use facial recognition technology. In July 2017, the City Council approved a $1 million contract with DataWorks Plus of Greenville, South Carolina. The contract expires July 17, 2020.
Police had been using the technology under standard operating procedures. When Craig in June asked the board to approve a permanent policy, the issue became contentious, with dozens of people coming to board meetings to discuss their concerns.
Under the city charter, the police board must approve a permanent policy governing use of the technology. The board was set to vote on the issue at its June 27 meeting, but the issue was tabled until the next meeting on July 11. A few days before that meeting, Craig said he wanted to adjust the policy, so the vote was removed from the board agenda.
Police commissioner Willie Burton was arrested at the July 11 board meeting after he expressed concern about use of the technology and asked new board chairwoman Lisa Carter whether she would run the board differently than she had during her 2017-18 term.
Burton on Wednesday criticized the police department for using the technology without first getting permission from the board.
"We're the oversight body," he said. "They should have come before the board, rather than using it without going through the process."
Craig replied: "It was mentioned during a recent City Council meeting that it would’ve been more appropriate for me to create a policy and have it approved by the commission, and then bring the software to the council for their approval. Had I known the reaction this was going to get, I maybe would have done it that way.
"We have an overarching policy, a data-sharing agreement, which covers all the technology we use, but it isn't specific to facial recognition," Craig said. "This was not an attempt to do anything secretly; we brought it before the council and they had a round of questions which were answered (by police officials before the software purchase was approved in July 2017).
"The thing is, we don't always publicize the tools we use," Craig said. "Many times our work is confidential, and we always want to be concerned about the impact it has on future prosecution, so we don't always talk about our methodologies."
Earlier this week, concerns about the technology were discussed at a meeting at King Solomon Baptist Church in Detroit. The meeting was organized by state Rep. Isaac Robinson, D-Detroit, who recently introduced House Bill 4810, which would prevent police in Michigan from using facial recognition technology for five years.
In an effort to educate the public about the technology, Craig in recent weeks has been giving tours of the Real Time Crime Center, where the software is used, so people can better understand how it works. Craig said reporters, citizens, police commissioners and ministers have toured the center, with more tours planned.
"We've gotten a tremendous positive response from most people who've gone through there," Craig said. "People have an 'a-ha' moment when they see how this technology is used."
Police commissioner Willie Bell is among those who took the tour. "I was skeptical, but now I support it," Bell said. "(Police commissioners) were impressed. Once you see how this works, it takes all the myth out of it."
Bell said the board is not expected to vote on approving the policy at Thursday's board meeting. "Maybe there will be a vote (at the Aug. 8 meeting)," he said.
Per the proposed policy, which is posted on the city's website, the software may be used only after a violent crime or home invasion occurs. Still images taken from video or other sources are fed into the software, which produces several possible matches.
The software draws from a pool of thousands of social media pictures, mugshots and the Michigan Secretary of State database of driver's license photos.
Two FBI-trained analysts in the Real Time Crime Center then go through the possible matches and decide if the photo of the suspect committing the crime looks like the picture the software produced.
If both analysts agree, they must then get a supervisor to concur before the information and photo are forwarded to the detective in charge of the case, the chief said.
"Even then, after the detective gets it, they can't use just that to make an arrest," Craig said. "There has to be other evidence. So as you can see, there are plenty of safeguards built into this process."
Burton, a vocal critic of the technology, said he has not taken the tour. When asked if he plans to visit the crime center and learn how the software works, Burton said: "Nobody told me about it."
"Facial recognition is techno-racism," Burton said. "Other cities, including Cambridge, Mass., and San Francisco are banning it because it misidentifies black and brown people. So why is Detroit — America's blackest city — so dead-set on using this?"
Craig said he's aware some algorithms are prone to false hits with photos of people with darker skin, but said that's mitigated by requiring two analysts and a supervisor to agree on whether photos picked by the computer match the person photographed committing a crime.
"There's a lot of rigor attached to this process," Craig said. He said since the department began using the technology about a year-and-a-half ago, 500 photos have been fed into the software. Only 30% of those were forwarded to detectives, he said.
"The analysts couldn't agree on an ID, so that's where it stopped," Craig said.
Craig added that the technology will not be used to catch people who are in the United States illegally — a concern that has been aired at recent board meetings.
"Unequivocally: We will not use this for purposes of immigration," he said. "That's not our purview, and it certainly will not be used for those types of investigations."