What happens when everybody winds up wearing ‘AI body cams’?
Body-worn cameras, or “body cams,” are cameras attached to a person.
Body cams have become ubiquitous in US law enforcement, with all police departments serving populations of more than 1 million implementing them by 2020. Nationwide, 79% of officers work in departments that use body cams. And in 2022, US President Joseph R. Biden Jr. signed an executive order mandating them for federal officers in specific roles.
They’re so common now that it’s easy to forget how new they are. Police departments started testing them in earnest only around 2005.
Ten years ago, just a tiny minority of police wore body cams. But a series of high-profile incidents of police abuse in the mid-2010s created political pressure for departments to adopt them. That's why they are now standard equipment for police officers.
Studies of their efficacy have produced mixed results. A 2014 pilot program in Orlando, FL, showed a 53% reduction in use-of-force incidents and a 65% drop in civilian complaints. But a large-scale study in Washington, D.C., in 2017 found no significant effects.
Members of the public might believe that body cams exist to protect civilians from police abuse, and popular demand for cameras is almost entirely based on the belief that they’ll reduce overly aggressive police tactics. But, according to the Bureau of Justice Statistics (BJS), police and sheriff’s departments deploy body cams to improve officer safety, increase evidence quality, reduce civilian complaints, and reduce agency liability.
They’re mainly seen as beneficial to the person wearing the camera (and the organization they work for).
The spread of body cams to industry
Body cams are usually associated with cops. But the technology is increasingly deployed in prisons, private security, healthcare, education, retail, transportation, construction, highway construction and maintenance, sports, and just about any industry performing inspections.
These industries are finding that body cams improve documentation, increase safety, reduce theft and inventory loss, help with regulatory compliance, bolster employee accountability, generate evidence for use in lawsuits, and provide other benefits.
The future of body cam tech is AI
The first body cams were primitive: enormous, with narrow 68-degree fields of view, only 16GB of internal storage, and batteries that lasted just four hours.
Body cams now usually have high-resolution sensors, GPS, infrared for low-light conditions, and fast charging. They can be automatically activated through Bluetooth sensors, weapon release, or sirens. They use backend management systems to store, analyze, and share video footage.
The state of the art — and the future of the category — is multimodal AI.
A company called Polis Solutions partnered with Microsoft to develop an AI body cam system called TrustStat. Built on Microsoft’s Azure Government platform, TrustStat uses multimodal AI technology to study video, audio, and speech to interpret and analyze body language and actions, and other cues. According to the companies, it looks at entire interactions from start to finish to provide a nuanced understanding of police encounters with the public.
It’s designed to solve the problem of sifting through thousands of hours of footage to extract actionable information, with vastly more advanced versions coming soon to body cams for police and across all sectors.
AI ‘body cams’ for everybody
As the use of AI body cams grows to include all police departments, security personnel, and large numbers of employees across many industries, the public will also be getting AI body cams.
I’ve written in the past about the mainstreaming of AI glasses with cameras for multimodal AI. Remember Google’s Project Astra demo from Google I/O 2024? In that video, a researcher picked up a pair of AI glasses running Google Gemini and conversed with the AI about what they both were looking at.
This is how multimodal AI glasses will work.
Handling the video input could be similar to how Microsoft deals with captured screenshots for its Recall feature, available on Copilot+ PCs. In that system, Recall uses optical character recognition (OCR) to extract any text in the screenshots and store it as searchable plain text. Recall then applies a CLIP-style embeddings model to the screenshot content. This creates vector representations of both textual and visual elements in the images, enabling semantic search.
Using such a system in multimodal AI, a user could converse with their AI agent, asking questions about what the glasses were pointed at previously.
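The indexing-and-query loop described above can be sketched in a few lines of Python. This is a toy illustration only: a bag-of-words counter and cosine similarity stand in for a real CLIP-style embedding model, and the frame IDs and OCR strings are invented for the example, not drawn from any actual product.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": word counts. A production system would use a
    # learned multimodal model over both pixels and extracted text.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Standard cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Each entry pairs a captured frame with text recovered by OCR.
index = {
    "frame_001": embed("parking garage level 3 section B"),
    "frame_002": embed("restaurant menu daily special soup"),
    "frame_003": embed("train departure board platform 9"),
}

def search(query: str) -> str:
    # Return the frame whose stored vector best matches the query.
    q = embed(query)
    return max(index, key=lambda fid: cosine(q, index[fid]))

print(search("parking garage"))  # best-matching frame: frame_001
```

A real assistant would run the query against millions of such vectors in a vector database, but the retrieval principle — embed the question, rank stored frames by similarity — is the same.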
These glasses will almost certainly have a dashcam-like feature where video is constantly recorded and deleted. Users can push a button to capture and store the past 30 seconds or 30 minutes of video and audio — basically creating an AI body cam worn on the face.
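That constant-record-and-delete behavior maps naturally onto a fixed-size ring buffer: new frames overwrite the oldest ones until the user hits the button. A minimal Python sketch follows; the class and parameter names are my own invention for illustration, not any vendor's API.

```python
from collections import deque

class RollingRecorder:
    """Continuously overwrites old frames; a 'button press'
    snapshots the most recent window, dashcam-style."""

    def __init__(self, seconds: int, fps: int):
        # deque with maxlen silently drops the oldest frame
        # whenever a new one arrives at capacity.
        self.buffer = deque(maxlen=seconds * fps)

    def on_frame(self, frame) -> None:
        self.buffer.append(frame)

    def save_clip(self) -> list:
        # Copy out the retained window for permanent storage.
        return list(self.buffer)

rec = RollingRecorder(seconds=2, fps=3)  # tiny buffer for the demo
for i in range(10):                      # simulate 10 incoming frames
    rec.on_frame(f"frame_{i}")
clip = rec.save_clip()
print(clip)  # only the last 2 s x 3 fps = 6 frames survive
```

At 30 fps and 30 seconds of high-resolution video, a real device would buffer compressed frames in flash rather than Python objects in RAM, but the retention logic is the same.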
Smart glasses will be superior to body cams, and over time, AI body cams for police and other professionals will no doubt be replaced by AI camera glasses.
This raises the question: When everybody has AI body cams — specifically, glasses with AI body cam functionality — what does society then look like?
The legal and social implications of AI camera glasses
Let’s start with the basics. Say a police officer pulls over a driver 10 years from now. Both the cop and the driver will record the encounter, save the video, and use AI to report on what happened and how each person treated the other.
This will likely prove popular with the public. When law enforcement controls the cameras exclusively, the occasional bad cop might choose to record or not record, producing a selective account of the incident. If the public is recording, too, that could better serve the original purpose of body cams. Either way, everyone will be surveilled.
Oracle founder Larry Ellison could have been referring to this earlier this month when he said, “We’re going to have supervision…. Every police officer is going to be supervised at all times, and if there’s a problem, AI will report that problem and report it to the appropriate person. Citizens will be on their best behavior because we are constantly recording and reporting everything that’s going on.”
All encounters between people could be subject to AI-analyzed body cam-like surveillance.
A striking contrast exists between fictional predictions about mass surveillance and what actually happened. In George Orwell’s novel 1984, the government installed screens in every home, workplace, and public space to monitor citizens (and propagandize them). For a century, we’ve generally conceived of mass surveillance as something the government does to the public.
As it happened, we, the citizens, installed cameras in those places — webcams, doorbell cams, security cams, and smartphones, recording, watching, and capturing video. AI camera glasses will simply add to the billions of cameras already in use, while making them more automated and actionable through AI analysis and interaction.
Legally, footage from body cam-like AI camera glasses probably will and should be considered “digital memory.” Of course, everyone has the right to natural memory of what they experienced. That right should be extended to digitally captured memory unless that “memory” violates another person’s privacy.
In free societies, AI camera glasses with AI body cam-like functionality won’t be banned. (If they’re required for a person to have clear eyesight, for example — because they’ll be fitted with prescription lenses — it would be legally difficult for the police to confiscate them.)
The main point of all this is that we all know about police body cams. We should also understand how AI now processes body cam footage.
And it’s time to understand that the functionality of AI-based body cams is coming soon to everyone.