Colorado police are already using AI — but experts say the tech still has red flags

Artificial intelligence has arrived at many of Colorado’s law enforcement agencies, and police officials say the technology is helping officers spend less time behind computer screens and more time out in their communities.

But exactly how law enforcement is using AI, and what ripple effects the technology will have, is not always clear, and experts warn that it comes with red flags around privacy and bias.

The AI that most people think about — deepfake videos of politicians and celebrities, or The Beatles releasing a new song with John Lennon’s vocals 43 years after his death — is not the kind of artificial intelligence being used by officers patrolling Colorado’s streets and neighborhoods.

But it’s a powerful tool, just the same. Officers are using AI to turn body-camera recordings of their interactions with the public into official reports, scan people’s faces and track license plates.

“AI is a huge deal,” said Anthony Tassone, chief executive of Truleo, an AI platform for law enforcement that acts as an assistant to help officers write reports; answer their questions about department policies and laws; and transcribe witness interviews.

Truleo has contracts with more than 20 police agencies in Colorado, but Tassone declined to identify those departments, citing client confidentiality.

“Every single cop (at those departments) is getting an AI assistant. It’s just like getting a gun, a camera or a car — it’s going to be standard issue,” he said.

While Tassone is a proponent of how AI can help police become more efficient and better serve their communities, he’s also concerned about a lack of accountability in how some AI platforms collect and process people’s private data — something he says Truleo does not do.

AI in policing is also worrisome for civil liberties advocates like the American Civil Liberties Union, which says the technology has the potential to amplify bias and introduce false information into the justice system.

“Law enforcement agencies are using these technologies and have been for quite some time, and it’s really hard to put the genie back in the bottle,” said Anaya Robinson, public policy director with the ACLU of Colorado. “But that doesn’t mean we can’t create meaningful regulations and come to a decision as a community if we want law enforcement to use AI, how we want it to be used and what boundaries and parameters to put around it.”

Who’s using AI?

The Denver Post reached out to more than 100 Colorado police departments and county sheriff’s offices to ask whether their officers and deputies were using artificial intelligence programs like Draft One, created by body-worn camera manufacturer Axon, or Truleo.

Of the 59 agencies that responded, 24 said their departments are using AI programs — most commonly Draft One, which uses artificial intelligence to convert body camera footage into written police reports.

Denver and other metro cities also use AI-powered license plate readers from the company Flock, despite concerns the system creates a mass surveillance network that can be exploited.

In Aurora, the City Council in October gave the green light for the Aurora Police Department to use an AI facial recognition program that city officials say “uses biometric algorithms within software applications to examine and compare distinguishing features of a human face,” according to a summary posted on Aurora’s website.

The technology helps officers find and prevent crime and identify people who refuse to tell police who they are, Aurora officials wrote in the summary.

Critics, including the U.S. Commission on Civil Rights, say the technology misidentifies Black people, people of East Asian descent, women and older adults at higher rates.

“The ability for surveillance to occur on a grand scale through the use of (facial recognition technology) prompts legal questions regarding the boundaries between permissible and non-permissible collection of data,” the commission wrote in a 2024 report.

An automatic license plate reader is pictured Wednesday, June 25, 2025, as drivers travel past it on Eisenhower Boulevard near Boise Avenue in Loveland, Colo. (Photo by Jenny Sparks/Loveland Reporter-Herald)

Two of Colorado’s largest law enforcement agencies, the Denver and Aurora police departments, declined interviews with The Post about how their departments are using AI.

In a statement, the Denver Police Department said the agency is not using any AI product, and any future use would have to be reliable and provide a cost or efficiency benefit.

“DPD is currently exploring Large Language Model (LLM) artificial intelligence technology to evaluate the potential benefits,” DPD officials said. “This technology is designed to help write reports quicker.”

Large language models — like ChatGPT — are designed to understand and generate text like a human would, and are trained on books, websites and articles, according to Stanford University.

Denver police officials have been in contact with software vendors but haven’t made a decision, which ultimately will go through a procurement process with the city’s technology services team.

The Aurora Police Department is also exploring AI use for body-worn cameras and report-writing, but has not entered into any contracts, spokesperson Joe Moylan said. The department did not respond to an interview request about the face-scanning technology.

Aurora police previously used Truleo to review body-worn camera footage, which resulted in a 57% drop in officers using profanity, insults, threats or inappropriate language. But city officials did not renew the department’s contract after the first year, letting it expire in March 2024.

Arvada, Boulder police see time savings, less bias with AI

The Boulder Police Department rolled out a pilot program to use AI for writing police reports in 2024 and saw such success with it that the technology was approved for agency-wide use in 2025, Chief Stephen Redfearn said.

Boulder police use Draft One by Axon, the same company that makes the body-worn cameras that record officers interacting with the public.

Draft One takes the audio recording of officers’ interactions and transcribes it into a report that the officer can edit, review and then file. Redfearn estimates it saves about 30 minutes per report, which adds up fast.

“We’ve had our eyes wide open that it’s a new thing that could come with challenges, but as of now, we have not run into any challenges, and we think it’s a good tool to keep our officers on the street,” he said.

Redfearn sees AI as a way to overcome implicit biases. For example, if an officer has responded to calls at someone’s home multiple times and the person is always drunk, the officer might write the report differently, with a bias, compared with someone approaching the situation with fresh eyes. AI can be those fresh eyes, Redfearn said.

“We’re typically finding a much more accurate, verbatim documentation of what people are telling officers on calls for service,” he said.

After reviewing the AI report, the officer checks a box that says it’s a fair and accurate representation of what happened, and a sentence noting AI was used to generate the report is added. Once a month, supervisors pull a handful of videos and compare them to AI reports to make sure they’re accurate.

Boulder Police officer Jarrett Mastriona uses artificial intelligence to generate a police report at the police station in Boulder, Colorado, on Thursday, May 22, 2025. (Photo by Hyoung Chang/The Denver Post)

A few miles south, Detective Daniel Sauter was one of the first people to test-drive AI at the Arvada Police Department. He regularly used Draft One to write simple reports until a recent move from patrol officer to crime scene investigator. He estimates the program helped him work about 25% faster.

There are limitations to the technology, Sauter said. Draft One isn’t helpful for more complex cases because it can only generate reports up to 800 words. And because it relies on body-worn camera recordings, if the camera doesn’t pick up a conversation, neither will the AI program. But Sauter still “wholeheartedly” recommended that Arvada police adopt the technology.

“The less time we can spend in a station writing a report equals more time on the street, more time interrupting crime, more time conducting traffic stops, more time responding to calls for service. It makes officers more efficient,” he said.

‘Relatively untested’

Despite the time savings, some of the benefits police agencies see in AI are the very things that worry experts and critics like Tassone and the ACLU.

The latter has “a lot of concerns” about police departments using AI to write reports that will end up as evidence in court to determine whether someone is guilty, Robinson said.

“I can understand why they would. It lowers workload and administrative duties so they can spend more time actually policing, but it’s also a relatively untested area for accuracy and using legal language,” he said.

AI experts have also raised concerns about learning models becoming biased because they are trained by people who have biases.

“Across the board, there’s a lot of concerns about the reliability of the technology and about bias,” Robinson said. “It’s only able to create output and make decisions based on the data that’s sent to it and the data that’s used to teach AI how to function. Because we live in a society full of bias, a lot of the data used to teach AI is also biased.”

For Tassone, Truleo’s CEO, the power of AI has the potential to improve the quality of police training across the country by giving officers easy access to laws and policies that guide their work.

But how people’s personal and private information is used and stored when it’s sent to AI companies used by police departments is a massive — and frightening — question mark, he said.

“To say AI is very, very powerful is a huge understatement,” Tassone said. “There are people who are not responsible, not principled and not thinking through how to protect community members with this technology and are rushing things out without thinking about the long-term impact.”

People’s names, addresses and Social Security numbers being accessed by AI engines — and the private companies that own them — is a huge red flag, he added. His AI program, Truleo, automatically redacts personally identifiable information.

“I’m surprised it’s taking so long for police departments to wake up,” Tassone said. “This data is going to get stolen.”

In a statement, Axon officials said Draft One cannot share any data for any purpose other than providing its service, a requirement of the federal criminal justice information services security policy.

Colorado officials have touted the state’s new AI regulations as being among the first in the U.S., but they lack any kind of comprehensive rules for how law enforcement uses the technology, Robinson of the ACLU said.

“In reality, our preference would be that most of those types of AI that law enforcement is using would not be used, and if it is — and because it is — we need tighter regulations, we need more transparency and we believe the community needs to have a say before the technology begins being used,” he said.


