
The System That Says It Doesn’t Use AI - Part 1.
Nov 2
Ten years of icare. Zero scrutiny of the code deciding people’s lives.

Automation without oversight is not efficiency — it is risk at scale.
When a government agency claims it does not use artificial intelligence in workers’ compensation decisions, while its own industry and technology partners publicly describe the same systems under different names, the issue is no longer whether the public is being misled.
The issue is whether the State even understands the algorithms or technology it uses.
A Century of Automation in Disguise
From Bismarck’s 1884 no-fault recovery promise to GIO’s Colossus expert system in the late 1980s, Australia has been turning human judgment about injury into code for more than 40 years.
By 1989, GIO’s Colossus system was handling most personal-injury claims. Its stated goal: consistency. Its true impact: the digitisation of discretion.
When ownership moved offshore, the program spread globally. U.S. regulators later found insurers had used Colossus-type systems to undervalue claims, resulting in multi-million-dollar settlements.
Even the Australian technicians who helped build Colossus later admitted errors in its early design and created a more transparent successor which never scaled.
The logic survived: quantify pain, automate fairness.
NSW’s Fragmented Present
Function | Agency | Technology status
--- | --- | ---
Insurance & claims | icare | Runs on Guidewire, a claims platform approved in 2015.
Regulation & data | SIRA | Mid-stream “digital uplift”; no verified recovery baseline.
Workplace safety | SafeWork NSW | NSW Audit Office confirmed a 20-year-old core platform still in use (Feb 2024).
Three agencies. Three platforms. No single picture of recovery.
This fragmentation is not accidental. It is architectural.
A System Born in Technical Disconnection
The core platform now used by icare was scoped before icare even existed under the former WorkCover NSW, when insurance, regulation, and safety were still one body.
When WorkCover was split into icare, SIRA and SafeWork NSW in 2015, the IT architecture was not redesigned to support three-way data sharing or whole-of-system tracking.
icare got the new claims platform
SIRA inherited partial datasets
SafeWork was left with legacy systems
no shared data spine was ever built
What NSW built is not a workers’ compensation system. It is three disconnected machines pretending to be one.
This is now on public record: a 2021 probity review found the original platform procurement “materially non-competitive” and “fundamentally compromised”. Source: iTnews, 4 May 2021
“Ten years after icare was created, there has still been no independent audit of the automation deciding who gets care — and who gets denied.”
“No AI Here” — But Automation Everywhere
In April 2025, icare stated:
“icare does not utilise AI in any workers compensation decision-making.”
That is technically true only under a narrow definition of AI: one limited to self-learning models.
The insurance industry itself confirms the widespread use of:
automated triage engines
rule-based decision workflows
auto-generated claim letters
predictive claims scoring
Source: Insurance Council of Australia & CSIRO (2024)
Whether labelled AI or workflow logic, the effect is the same: software is shaping medical access, income rights, and human outcomes.
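To make the distinction concrete: a rule-based triage engine of the kind the industry reports describe contains no machine learning at all, yet its fixed rules still decide the path a claim takes. The sketch below is purely illustrative; every field name, threshold, and queue label is an assumption for demonstration, not icare’s actual logic.

```python
# Hypothetical sketch of a rule-based claims triage engine.
# No self-learning model is involved -- only fixed if/then rules --
# so it falls outside a narrow definition of "AI", yet it still
# routes claims without a named human decision-maker.
# All field names, thresholds and queue labels are illustrative.

def triage_claim(claim: dict) -> str:
    """Route a claim to a workflow queue using fixed if/then rules."""
    # Rule 1: missing paperwork triggers an automatic document request.
    if not claim.get("medical_certificate"):
        return "auto-request-documents"
    # Rule 2: low-cost claims are fast-tracked with an auto-generated letter.
    if claim.get("estimated_cost", 0) < 5000:
        return "auto-approve-letter"
    # Rule 3: certain injury codes are flagged for a denial review queue.
    if claim.get("injury_code") in {"PSY-01", "CTS-04"}:
        return "flag-for-denial-review"
    # Everything else reaches a human case manager.
    return "manual-review"

print(triage_claim({"medical_certificate": True, "estimated_cost": 900}))
```

Under rules like these, a claim can be routed, letter-drafted, and queued in seconds. The output is deterministic and auditable in principle, but only if someone is actually auditing the rules.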
The Lived Consequence
Injured workers describe patterns that align with automation, not human judgment:
rejection letters issued within minutes
duplicate document requests
unexplained treatment freezes
no named decision-maker
These aren’t caseworker quirks. They are the fingerprints of code.
Why This Matters
Tens of thousands rely on this system every year.
Yet:
automation has never been independently audited
recovery outcomes are not verified
no one can say where human discretion stops and software begins
A safety net without oversight is not a system. It is a live experiment, and the injured are the test subjects.
The Questions NSW Can No Longer Avoid
Who — in law — is the decision-maker: a human officer or a workflow rule?
How often are automated outputs overridden, and by whom?
Has automation reduced or delayed entitlements? No one has checked.
Why deny AI while industry reports confirm equivalent systems?
Where is the legal consent from people whose claims pass through software engines?
Until these answers exist, the system cannot claim legitimacy, only authority.
The Governance Truth in One Line
“From Bismarck’s 1884 no-fault promise to GIO’s 1980s automation to 2025’s ‘no AI here’ denials — every reform has prioritised efficiency over recovery.”
Automation without transparency doesn’t modernise a system. It industrialises harm.
Sources (all publicly available)
NSW Auditor-General (2024) — Workers Compensation Claims Management
https://www.audit.nsw.gov.au/our-work/reports/workers-compensation-claims-management
Insurance Council of Australia & CSIRO (2024) — AI for Better Insurance
https://insurancecouncil.com.au/resource/ai-for-better-insurance-report/
NSW Audit Office (27 Feb 2024) — SafeWork NSW Performance Audit
https://www.audit.nsw.gov.au/our-work/reports/safework-nsw-performance-audit
iTnews (4 May 2021) — NSW state insurer slammed for “sloppy” Capgemini deal
https://www.itnews.com.au/news/nsw-state-insurer-slammed-for-sloppy-capgemini-deal-564138
icare statement to stakeholder (April 2025): “icare does not utilise AI in workers compensation decision-making.” (copy on file)
(No allegation of individual misconduct is made. All claims are based on public reports, audits or published correspondence.)