
Home Affairs blocks ChatGPT use within government

The Department of Home Affairs has placed a block on ChatGPT for public servants, as it remains hopeful that a whole-of-government position on the AI tool will be established.

Daniel Croft
Tue, 23 May 2023

Home Affairs Secretary Mike Pezzullo said there were concerns about how ChatGPT, as a third-party entity, handles data, and that the suspension was intended to prevent the use of artificial intelligence (AI) without direct evaluation and approval.

“Managing something corporately where you might have a proprietary engagement, where you know where the data’s stored and what limitations are placed on the data is one thing,” said Pezzullo at a Senate estimates hearing on Monday (22 May).

“The question is whether, as a department sitting within the broader government, we’re willing … I must say, I’m not at the moment.


“That’s why I’ve directed a suspension.”

Pezzullo added that the block was put in place to prevent individuals from deciding to use ChatGPT out of convenience, a scenario that would not meet a sufficiently high standard of security.

“I don’t want a permissive situation where an officer can individually decide, without any safeguards, to use this technology because it’ll make [their] day go faster,” he said.

Pezzullo said that there are indeed scenarios in which machine learning and large language models have a place in helping the department. He said that they were typically purchased “through proprietary arrangements”, allowing them to be controlled. However, his concern remained with openly available technologies like ChatGPT, which cannot be controlled in the same way.

The block is not a blanket ban on the technology; those who wish to use the AI to experiment or to assist their work can seek specific approval to do so.

“We have initially, at this point in time, blocked it, and then parts of the department can seek a business case to access that capability, and have done so to this point in time, noting there’s certainly some value in exploring the capabilities as a tool for experimentation and learning and looking at utility for innovation and the like,” said Justine Saunders, Home Affairs chief operating officer.

Saunders added that it was of utmost importance not to incorporate any “information relating to the department” into questions and prompts being fed to ChatGPT.

While the Home Affairs secretary did not rule out making the block on ChatGPT use permanent, he said that a government framework outlining the procedures and security standards surrounding its use would be an appropriate solution.

Saunders added that the use of ChatGPT is a “whole-of-government” concern.

“It needs to be considered in Australian government along with private sector continually evaluating emerging technologies and assessing both their potential and the risks associated with their use in the public sector,” Saunders said.

The block comes shortly after calls for government regulation of AI development by Sam Altman, CEO of ChatGPT maker OpenAI.

Altman appeared before Congress last week, warning that the rapid development of a powerful technology such as AI can have detrimental consequences if not properly regulated.

“I think if this technology goes wrong, it can go quite wrong, and we want to be quite vocal about that; we want to work with the government to prevent that from happening,” said Altman.

“We try to be very clear about what the downside case is and the work that we have to do to mitigate that.”

While backing the security measures that OpenAI has put in place in developing ChatGPT, Altman said the US government could establish new enforceable frameworks requiring AI models to meet specific safety standards, be evaluated by independent auditors, and pass specific tests before launch.

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of and experience writing in the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications, including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.
