
The Algorithm That Scaled The Harm
The humanitarian crisis in insurance claims management started in Australia — New South Wales, to be precise.
The Algorithm That Couldn’t Feel
Six decades after Premier Jack Lang founded the Government Insurance Office (GIO) in 1926 to curb profiteering by private insurers, a quiet revolution began inside its walls.
In the late 1980s, long before the world had heard of ChatGPT, GIO’s programmers and actuaries developed one of the earliest algorithmic systems for managing insurance claims. It was called Colossus, and it was designed to standardise payouts and stop what the industry called “claims leakage”. It promised fairness through consistency.
Instead, it replaced clinical judgment with code.
As recorded by GIO historian Jane R. Nauta in the NSW Parliamentary Library archives, this was the turning point—when human discretion gave way to actuarial logic. What began as a public innovation was later exported globally and adopted by major insurers as a cost-containment tool.
When psychological injuries began to rise, the programming didn’t evolve.
It was built on bones, not minds—trained on fractures, not trauma.
The algorithm couldn’t read grief, moral trauma, or PTSD.
Through the voices of lived experience across Australia and internationally, we began to see what the data never could. People warned of what was coming: clinicians, advocates, and the injured themselves spoke out. But the insurers didn’t listen. They trusted the data, not the damage.
The result has been the quiet destruction of lives—proof that when a system is built to manage cost but not compassion, it will inevitably harm the people it was meant to protect.

Delay, Deny, Defend
Law professor Jay M. Feinman, in his book Delay, Deny, Defend, shows how this logic became embedded in global insurance practice. Algorithms trained to detect “leakage” learned to defend profits, not people, turning delay, denial, and defence into a default operating model and scaling the harm.
Systems built to ensure fairness became machines of resistance, quietly shaping every interaction between the injured and those tasked with their recovery.
Exported to the World
Colossus didn’t stay in Australia. By the 1990s, its underlying logic had been exported and commercialised, forming the blueprint for automated claims management worldwide. Today, its digital descendants drive much of the insurance sector’s back-end processing, determining how injury, illness, and liability are managed at scale.
Platforms like Guidewire — now the industry’s dominant global infrastructure — promise efficiency and standardisation. Yet their true power lies in how they’re programmed, and by whom. That configuration isn’t done by the software vendor; it’s done by the customer — the insurer — whose own analysts and contractors define the business rules, data fields, triage models, and automation pathways that decide outcomes. When those parameters favour cost control over care, the results cascade through the system. Algorithms can misclassify injuries, automate denials, or trigger underpayments. They can distort how business premiums are calculated, how provider invoices are processed, and how claims performance is reported to regulators.
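To make that concrete, here is a minimal Python sketch of what insurer-configured triage rules can look like. It is purely illustrative: the claim fields, thresholds, and path names are assumptions, not Guidewire’s rule language or any insurer’s real configuration.

```python
# Hypothetical sketch of insurer-configured triage rules. These field
# names, thresholds, and paths are illustrative assumptions only --
# not Guidewire's rule language or any insurer's real configuration.
from dataclasses import dataclass

@dataclass
class Claim:
    injury_type: str       # e.g. "fracture", "psychological"
    projected_cost: float  # actuarial estimate of lifetime claim cost

# Business rules are set by the insurer's analysts, not the vendor.
# Tuning these two values alone changes who gets fast-tracked care.
COST_THRESHOLD = 50_000.0
FAST_TRACK_TYPES = {"fracture", "laceration"}  # bones, not minds

def triage(claim: Claim) -> str:
    """Route a claim down a predefined path based on configured rules."""
    if claim.injury_type in FAST_TRACK_TYPES and claim.projected_cost < COST_THRESHOLD:
        return "fast-track"    # automated approval, prompt treatment
    if claim.injury_type == "psychological":
        return "investigate"   # extra scrutiny: delay built in by design
    return "manual-review"

print(triage(Claim("fracture", 12_000.0)))       # -> fast-track
print(triage(Claim("psychological", 12_000.0)))  # -> investigate
```

Nothing in the platform forces these choices; moving a single threshold, or adding an injury type to the fast-track set, changes who receives prompt care.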
It’s not the technology that discriminates — it’s the data, the directives, and the institutional mindset that built it. And in an industry grounded in actuarial data, those numbers are tied to life expectancy, return-to-work probability, and projected cost.
A lifelong injury represents years of payments; a death is a single payout.
The mathematics is coldly efficient — a life ended costs less than a life sustained. This is the hidden arithmetic of algorithmic care: when compassion is replaced by code, and code is written to serve the balance sheet.
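The arithmetic is simple enough to work through. The sketch below, with every figure invented for illustration, compares a hypothetical lump-sum death benefit with the present value of thirty years of weekly support payments.

```python
# Illustrative arithmetic only: every figure below is invented.
# Compare a single lump-sum death benefit with the present value
# of decades of weekly support payments for a lifelong injury.

def present_value(weekly_payment: float, years: int, annual_rate: float) -> float:
    """Discounted value today of a weekly payment stream (52 payments/year)."""
    i = annual_rate / 52          # per-week discount rate
    n = years * 52                # number of weekly payments
    return weekly_payment * (1 - (1 + i) ** -n) / i

lifelong_injury = present_value(weekly_payment=1_000.0, years=30, annual_rate=0.04)
death_benefit = 850_000.0         # hypothetical single statutory payout

print(f"Lifelong injury (PV of payments): ${lifelong_injury:,.0f}")  # ~ $908,000
print(f"Death benefit (one payout):       ${death_benefit:,.0f}")
```

On these assumed numbers, a sustained life costs the scheme more than a lump-sum payout would, which is precisely the incentive described above.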


Understanding the “Insurance Float”
When an insurer collects a premium (in this case, the compulsory premiums employers pay into the government scheme), it receives the money long before it must pay a claim, sometimes months or even years later. That pool of unpaid claims money is called the float. While it sits on the books, it is invested, earning returns for the insurer or the government, not for the injured person.
In theory, this is simply good capital management, which is why Treasury is involved in overseeing workers’ compensation.
In practice, it creates a powerful incentive to delay, deny, or minimise payouts, because every extra day a claim remains unsettled keeps the float earning profit.
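A back-of-the-envelope sketch shows the incentive at work; the reserve amount and yield below are invented for illustration.

```python
# Minimal sketch of the float incentive, using invented numbers.
# Every day a claim stays unsettled, the money reserved for it
# keeps earning investment returns for the scheme, not the claimant.
claim_reserve = 200_000.0  # hypothetical amount set aside for one claim
annual_return = 0.05       # hypothetical yield earned on the float

daily_earnings = claim_reserve * annual_return / 365
for days_unsettled in (30, 180, 730):
    print(f"{days_unsettled:>4} days unsettled -> "
          f"${daily_earnings * days_unsettled:,.0f} earned on the float")
```

Two years of delay on this one hypothetical claim earns roughly $20,000 in investment returns, money that flows to the scheme rather than to the injured worker.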
It is a finance system pretending to be a health system.
What Jack Lang once designed to stop profiteering in 1926 has, a century later, evolved into a financial machine where human suffering itself generates yield.
The Impact of the 2020 icare Scandal Continues to Be Felt
Fast-forward to New South Wales in 2020, when the state-run insurer icare faced a storm of public scrutiny. Whistleblowers exposed underpayments, mismanaged claims, and millions spent on untested IT systems. Yet the human impact of the IT scandal was never publicly examined; it was lost in the political landscape. So here we are again…
At the heart of the scandal was a new model designed with advice from McKinsey & Company in 2017 — recommending the “streamlining and automation of claims management through a single IT platform.”
That platform — Guidewire, implemented as the Nominal Insurer Single Platform (NISP) — was programmed to triage claims automatically, sorting workers by algorithmic severity ratings and predefined recovery paths.
Independent reviews later found inaccuracies in those triage outcomes, governance failures, and significant declines in return-to-work results.
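To see how a small configuration choice can skew triage at scale, consider this toy Python sketch. The weights, thresholds, and path names are invented; they are not icare’s or Guidewire’s actual model.

```python
# Toy illustration of severity-based routing. The weights, thresholds,
# and path names below are invented; this is not icare's actual model.
def severity_score(days_off_work: int, is_psychological: bool) -> float:
    """A single under-weighted factor can skew every downstream outcome."""
    score = days_off_work * 0.1
    # Assume the configured model adds only a small weight for
    # psychological injury, under-stating its long-term complexity.
    if is_psychological:
        score += 2.0
    return score

def recovery_path(score: float) -> str:
    """Map a score onto a predefined recovery path."""
    if score < 5:
        return "low-touch"   # minimal case-management contact
    if score < 15:
        return "guided"
    return "intensive"       # full case management

# A psychological injury two weeks in scores low and is routed to the
# hands-off path, regardless of its true trajectory.
print(recovery_path(severity_score(days_off_work=14, is_psychological=True)))  # low-touch
```

Run across tens of thousands of claims, a single miscalibrated weight becomes a systemic pattern, which is how triage inaccuracies translate into declining return-to-work results.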
By 2025, the platform was still under remediation. Hundreds of millions had been spent, but the core question remained unanswered:
Can a system built to save money ever be trusted to deliver care? And why did it take until 2025 for the regulator, SIRA, to call for a cultural shift in claims management? The scale of this harm is staggering.


Don’t have a case worker? Here’s why
Every algorithm begins with a human decision.
Every dataset reflects human bias.
And every system — no matter how advanced — mirrors the priorities of its makers.
The tragedy of workers’ compensation in the digital age is not that machines took over — it’s that humans programmed them to think like accountants, not carers.
References
Jane R. Nauta, History of the Government Insurance Office (GIO), NSW Parliamentary Library Research Service (1999)
Jay M. Feinman, Delay, Deny, Defend: Why Insurance Companies Don’t Pay Claims and What You Can Do About It (2010)
McKinsey & Company, icare Claims Operating Model: Design and Transition Recommendations (2017), as cited in NSW State Insurance Regulatory Authority, Independent Review of icare Claims Management Model (2025)
