
Australian police are trialling AI to analyse body-worn camera footage, despite overseas failures and expert criticism

Police departments around the world are increasingly using body-worn cameras in an attempt to improve public trust and accountability. But this has created huge amounts of data, much of which is never reviewed or even seen.

Authors

  • Kathryn Henne

    Professor and Director, School of Regulation and Global Governance, Australian ³Ô¹ÏÍøÕ¾ University

  • Charles Orgill Gretton

    Associate Professor and TechLauncher Program Convener, Australian ³Ô¹ÏÍøÕ¾ University

  • Kanika Samuels-Wortley

    Associate Professor, Faculty of Social Science and Humanities, Ontario Tech University

Enter companies such as Truleo. These companies market artificial intelligence (AI) tools for analysing the data generated by body-worn cameras and other policing technologies.

Some police departments in the United States have trialled these tools before abandoning them because of concerns about privacy.

Truleo told The Conversation that police in Australia were now using its technology, but did not name any specific department. However, when The Conversation asked Australian police departments if they were using or considering using Truleo’s software, all except the Queensland Police Service said they were not.

In a statement, a Queensland Police Service spokesperson said it is currently conducting an AI trial with “a variety of technology” as part of its work tackling domestic and family violence. The spokesperson added: “Once the trial is completed, a detailed evaluation will be undertaken before the QPS considers future options for using the technology”.

But AI will not solve the challenges facing police – at least, not by itself.

The unfulfilled promise of body-worn cameras

The increased use of body-worn cameras by law enforcement agencies in recent years follows a number of high-profile cases involving police using force. In Australia, for example, a police officer was charged with the manslaughter of a 95-year-old great-grandmother after using a taser on her.

There is debate about whether body-worn cameras actually make police officers’ behaviour more transparent and accountable.

Some researchers have said their effectiveness is uncertain. Others have been more critical of their value.

These sentiments were echoed by a recent study.

The study examined the use of body-worn cameras in response to domestic and family violence in Australia. It acknowledged their potential utility but showed how data from these technologies might fall short of expectations. This is because of more foundational problems with how police engage with victim-survivors.

AI’s many uses in policing

Police have been using AI as part of their work for a long time.

For example, in 2000, New South Wales Police launched a program that used data analytics to predict which people were at risk of committing a crime, to enhance police supervision.

A review from the Law Enforcement Conduct Commission later revealed the program disproportionately targeted Indigenous youth, who subsequently faced heightened surveillance and increased arrests for minor crimes. This led to NSW Police ending the program in 2023.

The Queensland Police Service has also proposed a program using AI technologies to predict risk of domestic and family violence.

Experts, however, have pointed to potential harms of such programs, including criminalising victim-survivors.

Companies such as Truleo, which provides police with AI tools to analyse body-worn camera footage, claim these tools can assess officer "professionalism". However, it is not clear if what is being measured and assessed as "professionalism" correlates with officers' core duties and responsibilities.

In fact, the Seattle Police Department in the US ended its contract with Truleo despite acknowledging it was a “promising” trial.

It did so after the police union cited the use of camera footage for this purpose as infringing on officers' privacy.

The need for structural reform

AI tools could help police manage and analyse body-worn camera data. Their value depends on several conditions.

First, police must thoroughly evaluate any AI tools to ensure they are fit for purpose in a local context. Many of these technologies are developed overseas and trained on data with linguistic features, such as accents, inflections and insults, that are not common in Australia.

Second, police – and the companies that offer AI data analysis tools – must also be transparent about how they use body-worn camera footage. In particular, they must share where, how and under what arrangements data is processed and stored.

Finally – and most importantly – the use of AI technologies by police should not supersede organisational and structural reforms.

Police need to examine the impact of behaviours and processes that have resulted in inequitable surveillance practices. AI technologies are not solutions to these underlying dynamics.

Without an understanding of the systemic structures that sustain disparities in the criminal legal system, police will not be prepared to address the implications of integrating AI technologies into their work. If they do not, these technologies are more likely to exacerbate existing injustices and inequalities.

In short, the questions about AI shouldn’t be simply about technology but about police legitimacy.


Courtesy of The Conversation.