Nearly a decade ago, the Minneapolis Police Department started mounting automatic license plate readers on squad cars, presumably to aid with traffic enforcement. Then a Star Tribune investigation revealed that the MPD had purchased the technology without realizing (or perhaps not caring) that the data would be public.
MPD turned over more than 2 million plate scans — without any infrastructure to protect people's privacy rights. Anyone could learn your daily routine by getting your license plate number.
During the recent push to ban facial recognition technology (FRT) in Minneapolis, we mentioned this story frequently when talking to lawmakers and activists. It highlights a fundamental truth: Surveillance technology is both incredibly powerful and dangerous. Nowhere is this clearer than with FRT.
A recent Government Accountability Office (GAO) report revealed that at least six federal agencies used FRT to analyze images from protests over the police murder of George Floyd. These agencies claim the searches focused on images of individuals "suspected of violating the law" — the same justification used to pin George Floyd to the pavement. Government violations of our civil liberties always come cloaked in the language of order and public safety, which is why we must ask hard questions whenever the government gains access to tools that expand its power.
It was clear from last week's congressional hearing on police use of this technology that we have far more questions than answers. But simply requiring more transparency is insufficient. Knowing the government has abused our civil liberties and caused real harm, the solution is not to require it to keep better records of its abuse while we mull over the problem.
If the lack of a legal framework for such basic things as record-keeping and evaluating the technology's effectiveness were not reason enough to forestall its use, the clear and well-documented technological deficiencies, compounded by dangerous racial and gender biases in how police use surveillance technology, make a ban a moral imperative.
The stories of Robert Williams, Nijeer Parks and Michael Oliver highlight this risk. All three are Black men who were arrested and detained based solely on a computer's misidentification. During the public hearing on the Minneapolis FRT ban, Council Member Steve Fletcher put it succinctly:
"One of the biggest reasons to proceed with a ban on the technology is that it shows significant racial disparities, and in fact the further you get away from the middle-aged white men who are centered in the creation of the algorithms, the more likely you are to have error."