Leading police bodycam manufacturer bans facial recognition technology
The country’s leading manufacturer of police body cameras on Thursday announced it is banning the use of facial recognition technology on its devices.
Axon, which provides body cameras and software to dozens of police departments across the country, said it will not be adding face-matching products to its body cameras for the time being, citing concerns that the technology is not yet “reliable” enough for wide-scale implementation.
The company had solicited input from an external committee of researchers, who made recommendations that Axon agreed to this week.
“After a year of meetings and research, Axon’s AI and Policing Technology Ethics Board concluded that face recognition technology is not yet reliable enough to justify its use on body-worn cameras, and expressed particular concern regarding evidence of unequal and unreliable performance across races, ethnicities, genders and other identity groups,” the independent ethics board wrote in a statement.
Axon agreed to a host of the board’s recommendations, including halting plans to implement face-matching technology in police body cameras.
The precedent-setting ban comes as lawmakers on Capitol Hill and across the country have started taking action against the government’s use of facial recognition technology, raising sharp concerns over how it can be abused by law enforcement officials.
San Francisco earlier this year became the first city to ban the use of facial recognition technology by local law enforcement and other city agencies. Other cities in California and Somerville, Mass., are also considering all-out bans, signaling a potential sea change over how cities seek to regulate the sensitive and often controversial technology.
A coalition of outspoken critics, led by the American Civil Liberties Union, has pushed hard against the wide-scale implementation of facial recognition tech, saying it poses enormous privacy concerns and can intensify discrimination against minorities when used by police departments.
Research also suggests that the technology is ill-equipped to recognize the faces of trans and gender nonconforming people.
The ethics board that advised Axon, a panel selected by the New York University School of Law’s Policing Project, released a 42-page report laying out the ethical and logistical concerns around the technology.
“The Board is unwilling to endorse the development of face recognition technology of any sort that can be completely customized by the user,” the report states. “It strongly prefers a model in which the technologies that are made available are limited in what functions they can perform, so as to prevent misuse by law enforcement.”
Federal lawmakers on both sides of the aisle have come out aggressively against the widespread deployment of facial recognition technology, signaling they plan to draft legislation that would curb or even halt its implementation.
Facial recognition technology analyzes human faces in order to identify them. A growing number of critics argue it can be used to disproportionately surveil minorities and dissenters as police officers use the software to track and identify people in public places across the U.S.
The software is already being deployed widely by police departments, often without legislative oversight or public input. National legislation further down the line could impose a more stringent oversight process when law enforcement seeks to buy and use facial recognition technology.