The Australian government has announced that it will move to ban what it refers to as AI deepfake “nudify” apps, along with online stalking programs.
Advances in AI have made deepfake applications, which generate realistic synthetic images from a person’s face and other photos, increasingly easy to access.
While the technology can be used harmlessly, it also presents a real concern in the hands of criminals who would use it for fraud or the generation of AI child sexual abuse material (CSAM).
“The rapid acceleration and proliferation of these really powerful AI technologies is quite astounding. You don’t need thousands of images of the person or vast amounts of computing power … You can just harvest images from social media and tell the app an age and a body type, and it spits an image out in a few seconds,” said Australian eSafety commissioner Julie Inman Grant last year.
Now, after a long period of interest in the issue, the Albanese government has announced that it will take action against the applications, going beyond existing legislation that prohibits stalking and the distribution of non-consensual sexually explicit material.
“There is a place for AI and legitimate tracking technology in Australia, but there is no place for apps and technologies that are used solely to abuse, humiliate and harm people, especially our children,” said Communications Minister Anika Wells on Tuesday (2 September).
“That’s why the Albanese government will use every lever at our disposal to restrict access to nudification and undetectable online stalking apps and keep Australians safer from the serious harms they cause.”
Wells added that the government would work with industry on the best way to tackle these apps.
“These new, evolving technologies require a new, proactive approach to harm prevention – and we’ll work closely with industry to achieve this.
“While this move won’t eliminate the problem of abusive technology in one fell swoop, alongside existing laws and our world-leading online safety reforms, it will make a real difference in protecting Australians,” she said.
As with the under-16 social media ban, Wells said the onus will be on tech platforms to prevent AI deepfake and nudification tools from being made available.
The Digital Industry Group Inc (DIGI) has welcomed the government’s move to ban these apps.
“DIGI welcomes strong action from Minister Anika Wells against nudification apps and online stalking tools to strengthen online safety protections for Australians,” said Dr Jennifer Duxbury, director of regulatory affairs, policy, and research at DIGI.
“DIGI members are also taking proactive steps against these types of harmful services that have no place in our online ecosystem.
“We support ecosystem approaches to tackling harm and look forward to working constructively with the government on the details of the proposal.”
Legislation to eliminate the applications was first proposed by independent MP Kate Chaney in July, following a roundtable on AI-facilitated child exploitation.
The legislation would make possession of these apps a criminal offence, punishable by up to 15 years in prison.
Updated - 02/09/2025: Added statement from DIGI’s director of regulatory affairs, policy, and research, Dr Jennifer Duxbury.