When it comes to increasing police accountability and transparency, it’s policy, not technology, that does the heavy lifting. Police body cameras, tools that are overwhelmingly popular among the public, are sometimes cited as a valuable resource for addressing police misconduct and secrecy. They can be, but only if the right policies are in place. Absent policies that balance privacy interests with the need to increase police accountability, body cameras are surveillance tools. The risk of body camera surveillance is especially pronounced at a time when a major body camera manufacturer is doing more work on artificial intelligence, a development that may result in the widespread use of police body cameras with real‐time facial recognition capability.
Axon, the company that makes one of the most popular police body cameras, released a Law Enforcement Technology Report last year. That report outlined some of the technology that’s on the horizon: “Soon, you’ll be able to tell almost immediately if someone has an outstanding warrant against them, thanks to facial recognition technology.”
According to reporting by The Wall Street Journal, the merger of body camera and facial recognition technology is months rather than years away.
I’ve written on this blog before about why body cameras with facial recognition capability are a threat to civil liberties. I’m hardly alone in highlighting this threat. Axon’s leadership is clearly aware of the concerns raised by civil libertarians and has convened an AI Ethics Board. Yet it seems as if this board will have little if any impact on Axon’s development of technology that poses a significant risk to civil liberties.
An “Ethics Board” sounds like the kind of body a company that builds surveillance equipment and weapons should have. However, Axon’s AI Ethics Board lacks any kind of authority to ensure that the company’s products aren’t used unethically.
Yesterday, a coalition of civil rights groups wrote a letter to the Axon AI Ethics Board outlining their well‐founded concerns. The letter calls for board members to assert themselves and oppose real‐time facial recognition on body cameras, consult with community members with direct experience with the criminal justice system, limit sales to law enforcement agencies with appropriate body camera policies, and ensure that they have an oversight remit that covers all of Axon’s digital products.
The AI Ethics Board, made up of eight volunteer civil liberties, AI, and criminal justice experts, does not currently have the authority to veto Axon products. A functional ethics board should be free to halt products or, at the very least, publish reviews of all Axon devices.
If Axon’s ethics board guaranteed that only departments with policies that increase accountability and transparency while also protecting civil liberties could buy Axon products, the company would sell fewer body cameras. Dozens of America’s largest and most prominent police departments fail to implement praiseworthy body camera policies. For example, an Upturn examination of 75 police department body camera policies found that the Baltimore Police Department is the only department with strict limits on body camera footage being analyzed with facial recognition software, and that not a single department requires officers to write a report before reviewing body camera footage related to any incident. That giving the AI Ethics Board real authority would dramatically affect sales is one of the reasons Axon is unlikely to adhere to the recommendations in the recent coalition letter.
In all likelihood, Axon will continue to sell products that can, if governed by poor policies, erode civil liberties. Although Axon is signaling that it’s concerned about the ethical implications of its products, it doesn’t look as if its ethics board will prevent the proliferation of body cameras that will become known as tools of surveillance, not police accountability. In order for body cameras to achieve their potential as tools that improve policing, it’s policymakers rather than private companies who will have to implement the necessary changes.