A Utah police department’s experiment with artificial intelligence took an unexpected turn after a software-generated report claimed an officer had transformed into a frog.
The incident occurred earlier this month in Heber City, where police have been testing AI tools designed to write reports based on body camera footage.
According to a KTSU report, the bizarre claim was not the result of science fiction or misconduct, but a simple background error.
“The body cam software and the AI report writing software picked up on the movie that was playing in the background, which happened to be ‘The Princess and the Frog,’” Sgt. Rick Keel told KTSU.
“That’s when we learned the importance of correcting these AI-generated reports,” Keel added.
The department recently began testing two AI systems to streamline paperwork for officers.
The software generates police reports directly from body camera footage, with the goal of reducing the workload for busy officers.
To demonstrate how the software works, KTSU observed Keel during a staged traffic stop.
“Hi, I’m Rick with the Heber PD. The reason I’m stopping you today is for…” Keel said during the demonstration.
Afterward, the AI produced a report complete with timestamps from the interaction.
The software is capable of operating in both English and Spanish, and can even track tone and sentiment during conversations, per the report.
Keel said the main appeal of the technology is how much time it saves his employees.
Police reports typically take between one and two hours each to complete.

“I’m saving myself about 6-8 hours weekly now,” Keel said.
“I’m not the most tech-savvy person, so it’s very user-friendly,” he added.
The software the department has been using is called Code Four, and it costs roughly $30 per officer per month.
Keel said the trial period for Code Four ends next month.
Department officials have indicated they plan to continue using AI tools, though they are still deciding which system to adopt.
The Utah incident is not the only recent example of AI getting important details wrong.
In Canada, musician Ashley MacIsaac was incorrectly labeled a sex offender by Google’s AI-generated search summary.
According to the Canadian newspaper The Globe and Mail, the error led to the cancellation of a scheduled major performance.