Use of facial recognition technology is on the rise in the United States, and so are attempts to ban it. Ward 3 Council member Steve Fletcher, who represents Marcy-Holmes and other neighborhoods, wants to stop that use in Minneapolis before it can begin.
While the ordinance to ban this technology is still in its drafting stages, Fletcher is working alongside the city council, the American Civil Liberties Union (ACLU) of Minnesota, the Minneapolis city attorney and technology experts. Cities like Portland, Boston and New York City have already banned facial recognition technology in recent months, after its increased use to identify protesters during social justice demonstrations.
Studies have shown the technology to be largely inaccurate and racially biased.
“There are good reasons to think twice about how much information we give out for free about ourselves,” Fletcher said. “Privacy is going to be a very interesting conversation over the next several decades, as technology creates the ability for us to collapse privacy entirely, and let corporations and governments know everything about you.”
The ordinance likely will not be passed until January; Fletcher is currently conducting a second round of community engagement as he drafts it.
According to a report by the Electronic Frontier Foundation, a leading California-based data privacy nonprofit, facial recognition technology is meant to do three things: identify an unknown person, verify the identity of a known person and find specific people in public places who are already wanted by law enforcement.
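As a rough illustration of the difference between those uses, and not a description of any vendor's actual system, facial recognition software typically compares "embeddings," numeric vectors derived from face images. The hypothetical sketch below (the threshold and the random vectors standing in for embeddings are invented for the example) shows how verification is a one-to-one comparison while identification is a one-to-many search.

```python
# Illustrative sketch only: real systems derive embeddings from a trained
# neural network; here random vectors stand in for face embeddings.
import numpy as np

EMBEDDING_DIM = 128
MATCH_THRESHOLD = 0.6  # hypothetical similarity cutoff


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, claimed_identity: np.ndarray) -> bool:
    """One-to-one check: does the probe image match one known person's record?"""
    return cosine_similarity(probe, claimed_identity) >= MATCH_THRESHOLD


def identify(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> str | None:
    """One-to-many search: who, if anyone, in the gallery matches the probe?"""
    best_name, best_score = None, MATCH_THRESHOLD
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = {f"person_{i}": rng.normal(size=EMBEDDING_DIM) for i in range(5)}
    probe = gallery["person_3"] + rng.normal(scale=0.1, size=EMBEDDING_DIM)
    print(verify(probe, gallery["person_3"]))  # True: same person, small noise
    print(identify(probe, gallery))            # "person_3"
```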
However, this technology has come under significant fire for its inaccuracy over the years, especially when it comes to identifying people of color, women and children.
Emun Solomon, a University of Minnesota alum who works as a data scientist and product manager at Snapchat, attended a virtual town hall last month alongside Fletcher, community members and the national ACLU, where the groups discussed the damage this kind of technology has on a community.
“The technology is racist on two levels,” Solomon said. “It’s about an algorithm, an automated math problem, made by a skewed group of engineers.” It is that first problem, Solomon said, that produces the second: the technology does not accurately represent or account for the actual population.
A 2018 study by researchers at the Massachusetts Institute of Technology and Microsoft found that commercial facial recognition systems misidentified darker-skinned women at error rates of nearly 35%, compared with less than 1% for lighter-skinned men.
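The disparity the study describes is usually reported as a gap in per-group error rates measured on labeled test images. The sketch below shows how such a gap would be computed; the records in it are invented stand-ins, not data from the study.

```python
# Hypothetical example of measuring per-group misidentification rates; the
# records below are invented stand-ins, not data from the MIT/Microsoft study.
from collections import defaultdict

# Each record: (demographic group, true identity, identity predicted by the system)
results = [
    ("darker-skinned women", "A", "A"), ("darker-skinned women", "B", "C"),
    ("darker-skinned women", "D", "E"), ("lighter-skinned men", "F", "F"),
    ("lighter-skinned men", "G", "G"), ("lighter-skinned men", "H", "H"),
]

errors, totals = defaultdict(int), defaultdict(int)
for group, truth, predicted in results:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: {rate:.0%} misidentification rate")
```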
Munira Mohamed, a policy associate at the ACLU Minnesota, said one of the key issues with the technology is that it appears to only benefit one part of the population.
“It begins with the people making this technology … which is disproportionately white men. So they’ve made a technology that works for them,” Mohamed said. “And disregards communities of color.”
According to Solomon, many facial recognition systems are trained on mug shots of homeless individuals who are paid to participate in training sessions, which could increase the technology’s error rate and feed stereotypes.
The Hennepin County Sheriff’s Department is still authorized to use facial recognition technology, a fact made public by local journalist and data privacy advocate Tony Webster in 2016.
Fletcher also introduced data privacy principles to the city council in February and sees this ban as an extension of that work. Facial recognition technology is already all around us, he added, in smartphones and other security systems, but the ordinance would bar city government, including the Minneapolis Police Department, from using it.
His next step is to extend the ban to facial recognition used with security cameras in stadiums and other large public spaces.
“This has a lot of potential to really get really invasive in people’s lives and track too much data that we’re not consenting to,” Fletcher said.