Proposed ban on use of facial recognition technology by police advances in Minneapolis

Council member who backs prohibition says its use "feels dystopian."

January 22, 2021 at 7:17PM

Minneapolis could soon join a small but growing number of American cities that have banned most uses of facial recognition technology by their police and other municipal departments.

The proposal, which has been quietly discussed for months among a coalition of progressive groups including the American Civil Liberties Union, was signed off on without discussion by a City Council committee Thursday. The matter will next be taken up at a public comment session on Feb. 10 before going to the full council for a final vote on Feb. 12.

"If we have cameras all over the city tracking in real time, and keeping a record in real time of where everybody goes, that feels dystopian to me and that feels like it's open for abuse," Council Member Steve Fletcher, a champion of the proposed ordinance, said in an interview Thursday.

If the proposal were to pass, Minneapolis would follow cities like Los Angeles, Boston and Portland, Ore., which last fall passed what's considered among the most sweeping bans in the country.

Fletcher said the proposed ban would prohibit the city from obtaining or using — directly or through a third party — the technology, which employs machine learning algorithms to automatically match human faces captured from surveillance cameras, social media and other sources against an expansive countywide mug shot database.

But unlike Portland's ban, the ordinance wouldn't apply to outside law enforcement agencies operating within the city, such as the Hennepin County Sheriff's Office, or to private businesses, he said. Some exceptions would also be made for certain civilian uses.

Minneapolis police spokesman John Elder said Thursday that the chief will review the policy's language when it's final.

While Twin Cities law enforcement agencies are still behind other jurisdictions in using facial recognition tools, the growing demand for their services has set off alarms in privacy and civil rights circles, where many worry that the technology's demonstrated trouble with identifying people of color could widen existing criminal-justice disparities. Others see it as a dangerous step toward a surveillance state, where people's movements are tracked the moment they leave their homes.

But many in law enforcement have defended the technology as too important a tool to ignore in an increasingly wired world. With the help of facial recognition, even a grainy image captured on a security camera or social-media account can lead investigators to a suspect who might otherwise have gone undetected. The technology's potential has been on display in recent weeks as private firms used it to help the FBI identify supporters of former President Donald Trump who stormed and ransacked the U.S. Capitol, according to reporting by Reuters.

In Minnesota, observers say that efforts to address the technology's use at the state level are still in their infancy.

The Minneapolis Police Department, by far the biggest client of the county's facial recognition system, has for years deflected questions about its use of the technology. In 2018, a spokesperson told the Star Tribune that the department had no plans to use it.

But county records show that MPD personnel have accessed facial recognition software hundreds of times over the past five years: between Oct. 1, 2015, and Sept. 28, 2020, MPD investigators used the system 237 times — a figure still dwarfed by those of other big-city departments across the country.

In a white paper published last year by the Police Conduct Oversight Commission, the MPD said it wasn't using facial recognition with the body cameras worn by its officers, arguing that such technology doesn't exist, although the paper's authors pointed out that the cameras' manufacturer, Axon, "is developing automated solutions."

Fletcher previously mentioned the need for tight regulations, although at the time he wasn't advocating for an outright ban similar to those passed in places like San Francisco and Somerville, Mass. He said then that he wanted to study the issue further "to create a framework where we can invite technology in and take advantage of technology without creating a digital trail about individuals that becomes searchable and infringes on their privacy."

On Thursday, Fletcher said that his thinking about the technology had changed in light of research showing "really deep racial bias" and called its potential for exacerbating existing disparities in the criminal justice system "unacceptably high." A United Nations committee warned last year that rapidly evolving technologies like facial recognition risk reinforcing racism and xenophobia and creating a "feedback loop," for example by drawing on historical arrest data that may reflect racially biased policing.

In many ways, facial recognition tools are already part of everyday life, transforming the way people check in at airports, unlock smartphones or tag their friends in photos on social media.

The problem is that cities are quick to embrace new and emerging technologies — and the promises they hold — without first considering appropriate safeguards, according to Sarvesh Ramprakash, an organizer with the POSTME coalition (Public Oversight of Surveillance Technology and Military Equipment), which has been working with Fletcher to pass the ban.

Without strong regulation, Ramprakash said, he worries about the technology being used beyond criminal investigations, such as to identify and potentially prosecute opponents of the Enbridge Line 3 oil pipeline replacement in northern Minnesota.

"The big thing is there is a difference between proper operating procedure with transparency and dragnet surveillance — obviously that gray area is porous," he said.

"At an even more fundamental level … is this idea that the community should be the one to decide what tools law enforcement should have access to," said Chris Weyland, another POSTME coalition member who worked to write the draft ordinance. "The community should be able to say, we're cool with drones, but not facial recognition."

But efforts to regulate the technology have to go beyond policy debates to scrutiny of the machine learning algorithms that power it, he said.

"We can put in place policy guardrails, but the technology hasn't advanced to the point where the racial biases and the accuracy levels is good enough for me to feel comfortable saying all we need is a policy in place," he said.

Libor Jany • 612-673-4064

Twitter: @StribJany

about the writer

Libor Jany

Reporter

Libor Jany is the Minneapolis crime reporter for the Star Tribune. He joined the newspaper in 2013, after stints in newsrooms in Connecticut, New Jersey, California and Mississippi. He spent his first year working out of the paper's Washington County bureau, focusing on transportation and education issues, before moving to the Dakota County team.
