Project Green Light to add facial recognition software
Detroit police will soon integrate facial recognition software into the department’s Project Green Light video crime monitoring program.
Police say the software will be used only to investigate violent crimes and will help get dangerous criminals off the street, although civil rights groups have expressed concern about the use of the technology by law enforcement.
Detroit police recently bought the software for $1 million from Data Works Plus LLC, Assistant Chief James White said. A three-year contract allows use of the software, technical support and maintenance, he said.
White added the department has been using the technology to investigate violent crimes for more than a year, borrowing software from other agencies.
When the new Detroit police Real Time Crime Center opens in November, White said the new software will be integrated into the Green Light program, in which high-definition video feeds from participating gas stations, convenience stores and other businesses are monitored by police and civilians.
“This isn’t some super-secret piece of technology,” White said. “This isn’t Big Brother, and we’re not covertly trying to monitor people. We’re not going to use it to ID everyone who goes into our Green Light locations; it will be strictly confined to investigating violent crimes.”
Detroit police will compare violent crime suspects against driver’s license photos from the Michigan Secretary of State, pictures posted on social media, mugshots of criminals and other public databases, White said.
White said a standard operating procedure is in place governing use of the software; the purchase was approved by the City Council months ago.
The rules, which will be monitored by the department’s Civil Rights Integrity Unit, prohibit anyone from using the technology for anything other than investigating a violent crime, said White, who was in charge of ensuring the department’s compliance with federal mandates under the 2003 consent decree, which ended last year.
A 2016 Georgetown University law school study found nearly half of all American adults have been entered into law enforcement facial recognition databases, creating what the researchers called a “virtual lineup.”
“At least 26 states (and potentially as many as 30) allow law enforcement to run or request searches against their databases of driver’s license and ID photos,” the report said.
Michigan is among those states. Michigan State Police run the Statewide Network of Agency Photos, “a database of 4 million mug shots and 41 million driver’s license and ID photos from the Michigan (Secretary of State),” the report said.
Shelli Weisberg, legislative director for the American Civil Liberties Union of Michigan, said her concerns about police using the technology were not alleviated even after she studied the facial recognition program.
“First of all, it’s shocking how inaccurate it is,” Weisberg said. “When MSP showed me their program, they put my face in and brought up a number of false positives. Falsely identifying people as criminal suspects could lead to a host of other potential issues.”
Weisberg said the technology also could be less accurate in identifying minorities. “The program seems to have a population bias,” she said. “I think the bias comes because you have more white faces to use as the models for perfecting the technology. Bone structure is different between races, and they’ve perfected the technology to the majority structure.”
A 2011 University of Texas at Dallas study reached a similar conclusion: “When face sets contain more faces of one race than of other races, it is reasonable to expect recognition accuracy differences for faces from the ‘majority’ and ‘minority’ race,” it said.
The Georgetown study also found face recognition technology is less accurate on African-Americans, women and young people. In 2015, Google apologized after its Photos application mistook black people for gorillas.
“In the criminal justice system, which is already discriminatory toward people of color, if false positives are going to bring more people of color into that system, it’s going to increase the probability they’ll suffer in a system that doesn’t treat them justly in the first place,” Weisberg said.
State police policy states that the software cannot be used to establish positive identification of a suspect and results can be considered investigative leads only. It also says mobile facial recognition devices may be used only in limited, specified situations.
“We have a robust policy and auditing process to ensure compliance,” state police spokeswoman Shanon Banner said. “We also restrict access to only trained law enforcement personnel.”
Banner added: “Since Jan. 1, 2017, we’ve utilized facial recognition to provide 417 investigative leads to law enforcement. Note that per our policy, an investigative lead is not a positive identification; it is an investigative lead only and is not probable cause for an arrest. These investigative leads have assisted in the identification of suspects wanted for armed bank robbery, human trafficking, identity fraud, homicide and retail fraud.”
Jerome Morgan, 66, who lives on Detroit’s east side, said he doesn’t have a problem with police using the software.
“I’m 100 percent behind it, if it helps lock up the criminals,” he said.
Both White and Weisberg pointed out that facial recognition software is already widespread. The iPhone X and Galaxy S8 phones use the technology to unlock the devices.
“It seems to be a bit out of control and it seems to be a very problematic road to travel down,” Weisberg said.
White said he understands the civil rights concerns, but insisted Detroit police will not misuse the software.
“I would say to those concerned that this is an investigatory tool that will be used solely to investigate violent crimes, to get violent criminals off the street,” he said. “From that standpoint, the community should be comforted.”