Justice report

How AI could change the face of justice

Justice services are under huge pressure. I have worked across the sector and know the pressure felt by victims waiting for their day in court, by practitioners carrying heavy caseloads, by people trapped in cycles of reoffending, and by communities that need both safety and legitimacy, writes Russell Goodenough, Senior Vice President for Consulting Services and Head of Artificial Intelligence at CGI UK and Australia.

When delays stack up, the system does not just slow down; it becomes unfair.

I help organisations adopt AI responsibly, including CGI’s clients across criminal and civil justice, policing, prisons, probation and more widely. I am optimistic about where AI can help now, but clear about where it must be handled with care.

Timeliness is justice

A victim checks their phone again. No update. Months have passed since the incident, the statement, and the first promise that “the court date will come through soon”. Life moves on, but the case does not.

No one is choosing to be slow, and yet the case is travelling through a maze: documents requested, redacted, re-uploaded and re-checked; hearings listed, relisted and adjourned; digital evidence queued; teams chasing the same missing information in different formats. These are the right places to start with AI, where technology can quickly make a measurable difference.

I have seen the cost of fragmentation and feel driven to help. In the hours before hearings involving child abuse, I have watched professionals in the England and Wales Criminal Justice System rekey essential information across systems under crushing pressure because systems do not integrate. I have also seen judges adjourn cases so prosecution and defence can agree basic historical facts about a defendant, simply because different case management systems do not match. These are tasks where AI can have a very human impact.

Practical opportunities across justice

Justice is a system of systems. Improvements in one part ripple through the rest: policing affects prosecutions, courts affect prisons, and prisons affect communities. The near-term opportunities may be unglamorous, but they are very real.

In policing and prosecution, we can see AI improving case file quality, speeding up case preparation, and handling digital evidence more effectively. In courts, it can mean preparing bundles, scheduling and listing, and routine applications. Across each area, the immediate sweet spot is better information handling: completeness checks, version control, and summaries with citations back to source material so professionals can verify quickly. Used well, AI supports preparation and reduces rework. It does not replace professional judgement.
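To make the "completeness checks" idea concrete, here is a minimal sketch of what such a check might look like. The required items and function name are illustrative assumptions, not any real case management system's API:

```python
# Hypothetical sketch of a case-file completeness check: flag missing
# items before a hearing rather than discovering them on the day.
REQUIRED_ITEMS = {"charge_sheet", "witness_statements",
                  "disclosure_schedule", "defendant_record"}

def completeness_check(case_file):
    """Return the set of required items missing from a case file."""
    return REQUIRED_ITEMS - set(case_file)

# A file with only two of the four required items:
missing = completeness_check({"charge_sheet": "...",
                              "witness_statements": "..."})
```

Even a check this simple moves effort from reactive chasing to proactive flagging; the value comes from agreeing the required list across agencies, not from the code.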

Prisons and probation can benefit too, by reducing paperwork that crowds out rehabilitation work and by supporting education pathways and resettlement planning. Better outcomes here mean less reoffending and safer communities.

Civil justice matters as well. AI could widen power imbalances if it becomes a private advantage for those who can afford the best tools, but narrow them if it improves access, clarity and speed.

Where to start, and what not to rush into

A common mistake is jumping straight to the hardest challenges. When we discuss AI in justice, we often leap to predicting risk, recommending outcomes, or automating suspicion. Those debates matter, but they can distract from immediate value in high-volume admin and information work that consumes professionals' time without calling on their judgement.

That is not to say that complex use cases are not important. On the contrary, we are making great strides with complex topics such as early neutral evaluation for judicial decision-making. Rather, it is that we have learned where to start to deliver successful change projects.

We start with clear boundaries: AI should assist, not decide, and humans remain accountable. We build assurance in from day one, including robust logging, an audit trail, human review checkpoints, and a named accountable owner. We focus on two or three low-risk workflows at a time, measure what matters, and scale only when the evidence is clear.

We can deliver major improvements in throughput and productivity without taking on the legitimacy risk of algorithmic risk scoring on day one. That does not mean never. It means not without robust governance, clear legal basis, evidence standards, and transparent accountability.

Synthetic evidence and trust

Synthetic evidence and disinformation are a pressing concern. The risk is not only that an AI-generated image, audio clip or document might slip through undetected; it is that people can be exposed to material designed to mislead and influence, even if it is later identified and countered.

The answer is not a single detection tool. It is defence in depth: stronger provenance and chain of custody for digital artefacts, tamper-evident handling, clear standards for what “trusted” evidence looks like, and a culture of verification before reliance. We also need to raise AI literacy across justice professionals and, increasingly, across society.
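One building block of tamper-evident handling is a hash chain over custody records: each record is hashed together with the previous record's hash, so altering any entry invalidates every hash that follows it. The sketch below uses standard SHA-256 hashing; the record fields and function names are illustrative assumptions:

```python
import hashlib
import json

def chain_records(records):
    """Annotate each custody record with a hash linking it to its predecessor."""
    prev_hash = "0" * 64  # genesis value for the first record
    chained = []
    for rec in records:
        payload = json.dumps(rec, sort_keys=True) + prev_hash
        digest = hashlib.sha256(payload.encode()).hexdigest()
        chained.append({"record": rec, "prev": prev_hash, "hash": digest})
        prev_hash = digest
    return chained

def verify_chain(chained):
    """Recompute every hash; any alteration to any record is detected."""
    prev_hash = "0" * 64
    for entry in chained:
        payload = json.dumps(entry["record"], sort_keys=True) + prev_hash
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True
```

A chain like this does not prove a record was true when written; it proves the record has not changed since. That is exactly the provenance property chain of custody needs, and it is why "defence in depth" pairs it with standards and a culture of verification rather than relying on it alone.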

Public trust in justice is earned slowly and can be lost in a day. It will grow through practical success: fewer delays, clearer updates, and better outcomes for victims. But it could be destroyed by a single ill-advised decision where AI is perceived to have replaced human judgement, or where accountability is unclear. A minimum standard is straightforward: human accountability stays intact; adoption starts low-risk and high-volume; fairness is engineered and monitored; traceability and audit are mandatory; and challenge and redress are designed in.

A constructive invitation

If you work in the justice sector, you do not need another abstract debate about whether AI is good or bad. You need practical change that improves timeliness, protects legitimacy, and supports professionals to do the right thing.

If you are dealing with backlogs, disclosure friction, fragmented case information, or administrative overload, CGI is working hard to help you deliver value without compromising accountability.

For further information on CGI’s work in Northern Ireland visit:

W: www.cgi.com/uk/en-gb/cgi-in-northern-ireland
