AI rubbish: Consultancy gives Australia money back

Automated decisions have harmed poor Australians. Deloitte was asked to audit the system and delivered AI-generated fabrications of its own.

Digital sign warns "GPT-4 Ahead"

(Image: Urban Images/Shutterstock.com)


Deloitte is refunding an Australian federal ministry part of the fee it received for an audit report. The reason: the report apparently contained AI-hallucinated fabrications. Deloitte now admits to having used a generative language model that produced several false footnotes and cited non-existent sources. The firm nevertheless stands by the report's core findings.

According to the Financial Times, the fabricated sources include non-existent academic studies that the AI attributed to researchers at Lund and Sydney universities. The model is said to be an instance of OpenAI's GPT-4o provided by Microsoft, licensed by the Australian Department of Labour, and operated in an Azure instance.

The parties involved are not disclosing how much of the original contract value of 439,000 Australian dollars (around 247,000 euros) Deloitte will refund; in any case, an agreement has been reached. The Deloitte report published on the ministry's website was replaced by a revised version on 26 September.

The report delivers a damning verdict on the IT system Deloitte audited, the Targeted Compliance Framework. Since 2018, it has automatically penalized recipients of social benefits by temporarily suspending their payments when they are suspected of not having met certain conditions. However, the system has never correctly implemented the underlying law and the ministry's guidelines. In addition, benefits are often suspended for too long. Conversely, there is no guarantee that benefits that have actually been suspended are not paid out after all.


Attempts to rectify errors introduced new ones. In the five years audited, the automated decisions wrongly penalized at least 1,371 poor Australians. Moreover, the report finds a lack of traceability, validation, risk management, and oversight. According to Deloitte, even the proceedings before the bodies to which affected citizens can turn are neither transparent nor comprehensible.

According to the report, there is no documentation of the program logic, no reliable versioning, and no standards or benchmarks for assessing (poor) performance. The audited IT system for automated decisions on social benefits has failed; the sources invented by GPT do not change that.

(ds)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.