EFA conjures the spectre of robodebt and says GPT-4o is still not fit for purpose when it comes to government decision making.
The chair of technology policy not-for-profit Electronic Frontiers Australia (EFA) has expressed serious concerns about the rollout of GPT-4o in a pilot program for the Australian Public Service.
“GPT-4o, at this stage, is still under development and is not yet ready, in EFA’s opinion, to assist with government decision making,” EFA chair John Pane said in a 5 September statement.
“Part of the fundamental problem with adopting AI to make decisions is that the law is designed for human decision-makers. If an AI tool forms part of government decision making in any way, it could expose the government to legal actions similar to the robodebt administrative law actions.”
The rollout is part of a government program known as GovAI, led by the Department of Finance, which was already several months underway when iTnews reported on it in July. At that time, the pilot was in a closed beta phase, with the program expected to expand by November 2025.
iTnews based its reporting on a now-deleted document published by the Australian Public Service Academy.
EFA’s concern is that if a decision is challenged under the Administrative Decisions (Judicial Review) Act 1977, the decision-maker would normally be called upon to explain how that decision was reached and the premises upon which it was made.
“In cases where a decision, or part of a decision, is made by AI, the decision-maker would need to be able to explain the weightings assigned by the algorithm to various considerations, as well as the precedents the AI relied on and the data it was trained with,” Pane said.
“AI, including GPT-4o, has been notoriously bad at interpreting legislation and the law.”
Pane pointed out that Australian lawyers have already been sanctioned for using AI-assisted legal software that, in effect, made up citations to support their case. According to Lawyers Weekly, the “solicitor, who cannot be named, has had his ability to practise as a principal lawyer taken away, following an investigation by the Victorian Legal Services Board and Commissioner”.
The EFA contends that any decision affecting a member of the public could become a potential legal case. With that in mind, it’s unclear how the federal government has developed such trust in the technology, or whether the Solicitor-General provided the government with any advice on the matter.
“EFA asks: Will the government be transparent and accountable in its use of GPT-4o? Where are the procurement documents? What use cases are currently in deployment? What high risks were identified during deployment, and how were they mitigated?” Pane said.
“Let’s not forget robodebt, while the ink is still wet on the government’s settlement cheque …”
The federal government recently agreed to pay $475 million in compensation to victims of the robodebt scandal.
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.