Several US police departments are testing AI chatbots to draft crime reports, a move intended to streamline paperwork that has also raised concerns about bias and the quality of reporting.

US Police Departments Trial AI Chatbots for Crime Reporting Despite Potential Risks

In an effort to streamline processes and save valuable time, several US police departments have begun testing artificial intelligence (AI) chatbots to draft crime reports. The novel approach, however, comes with its own concerns and potential pitfalls.

The Oklahoma City Police Department recently began using an AI tool named Draft One to produce preliminary drafts of crime and incident reports from body camera audio. Sergeant Matt Gilmore was among the first to use the technology: after an unsuccessful suspect search, Draft One turned the audio captured by his body camera, including “every word and police dog bark,” into a written report in a mere eight seconds.

Draft One is powered by OpenAI’s GPT-4 model, widely known for its use in applications such as ChatGPT. The tool, launched earlier this year by technology and weapons developer Axon, is pitched as an “immediate force multiplier” and timesaver for law enforcement agencies.

ChatGPT is known to occasionally fabricate information, and Axon says it has tuned Draft One to mitigate that risk. Noah Spitzer-Williams, a senior product manager at Axon, explained that the tool’s “creativity dial” has been turned down to reduce the chance of embellishment or hallucination compared with using ChatGPT on its own.
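
Axon has not detailed its implementation, but the “creativity dial” Spitzer-Williams describes most plausibly maps to a sampling setting such as temperature. The sketch below is purely illustrative, assuming the OpenAI Python SDK and an invented transcript and prompt; it is not Axon’s actual pipeline.

```python
# Purely illustrative: assumes the OpenAI Python SDK (openai>=1.0), a hypothetical
# transcript, and a hypothetical prompt. This is not Axon's actual implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript = "Officer audio transcript goes here..."  # hypothetical body camera transcript

response = client.chat.completions.create(
    model="gpt-4",
    temperature=0.2,  # the lowered "creativity dial": less varied, less embellished output
    messages=[
        {
            "role": "system",
            "content": "Draft a factual incident report using only details stated in the transcript.",
        },
        {"role": "user", "content": transcript},
    ],
)

print(response.choices[0].message.content)  # a draft that still requires human review
```

Lowering the temperature makes the model favour its most probable wording rather than more inventive phrasings, which is roughly what “reducing embellishment” amounts to in practice.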

To keep its use in check, the Oklahoma City Police Department has restricted Draft One to minor incidents, excluding felonies and violent crimes where a report could lead directly to an arrest. Other departments, however, such as those in Fort Collins, Colorado, and Lafayette, Indiana, have adopted the technology for all types of case reports, and a police chief from one of these departments noted its popularity among officers.

The advance has raised some eyebrows. Legal scholar Andrew Ferguson told the Associated Press that the ease and automation the technology offers could make officers less meticulous in their report writing, a worry that sits within broader concerns about the ethics of automating critical tasks with AI.

AI has repeatedly been shown to exacerbate systemic discrimination; AI-driven hiring tools, for example, can entrench existing biases unless those biases are actively mitigated. Draft One includes safeguards such as mandatory human review and approval of every report, but it remains susceptible to human error and to the biases already present in policing.

Moreover, research by linguists has shown that large language models (LLMs) such as GPT-4 can perpetuate covert racism and dialect prejudice, particularly against marginalised varieties such as African American English (AAE). Other experts, including Logic(s) Magazine editor Edward Ongweso Jr. and IT professor Jathan Sadowski, have criticised automated crime reports on their podcast, arguing that biases ingrained in Western-centric data, and even in body cameras themselves, can harm marginalised communities.

Axon says it is aware of these concerns. In communication with ZDNET, Director of Strategic Communications Victoria Keough emphasised that human officers remain responsible for the final police narrative and that Axon rigorously tests its AI products to ensure responsible innovation. The company ran internal studies on racial bias using 382 sample reports, evaluating dimensions including Consistency, Completeness, and Word Choice Severity, and found no statistically significant differences between Draft One reports and the source transcripts.
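
Axon has not published the methodology behind those figures, but a comparison of this kind is commonly framed as a paired significance test on per-report scores. The following sketch is a hypothetical illustration using invented numbers, not Axon’s data.

```python
# Hypothetical illustration of a "no statistically significant difference" check.
# The scores below are invented and stand in for reviewer ratings on one dimension
# (e.g. Completeness); they are not Axon's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# One rating per report (382 reports), for the AI drafts and the source transcripts.
draft_scores = rng.normal(loc=4.1, scale=0.5, size=382)
transcript_scores = rng.normal(loc=4.1, scale=0.5, size=382)

# Paired test, since each draft is compared against the transcript of the same incident.
t_stat, p_value = stats.ttest_rel(draft_scores, transcript_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A p-value above the chosen threshold (commonly 0.05) on each dimension is what would typically be reported as “no statistically significant difference.”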

Beyond processing audio, Axon has also explored using computer vision to summarise video footage. Axon CEO Rick Smith acknowledged, however, that this area needs considerable work before it can be introduced, particularly given the sensitivities around policing and racial identity.

Axon’s overarching aim is to cut gun-related fatalities in police-civilian interactions by 50%. The company also manufactures body cameras, which are intended to provide objective evidence and improve policing. Even so, data from the Washington Post shows an increase in police shootings since 2020, suggesting that widespread adoption of body cameras alone may not be enough to curb such incidents.

The adoption of AI-powered tools like Draft One could mark a significant shift in how police departments produce and process crime reports. It remains to be seen, however, how these tools will affect public safety and whether they will be adopted more widely.

Source: Noah Wire Services
