As a federal MP prepares to introduce a bill to Parliament that would make the use of child abuse apps illegal, the CEO of the International Justice Mission has called the move a “common sense” one.
Independent MP Kate Chaney will introduce the bill today, 28 July, saying: “We also need to be able to nimbly respond to risks as they emerge – and this is a clear gap in our criminal code.”
Chaney’s bill would make it an offence to download applications capable of creating child abuse material, as well as to collect data for the purpose of creating or training such technology.
“Currently, possession of these images is illegal, but it’s not illegal to possess these particular types of AI tools that are designed for the sole purpose of creating child sexual abuse material,” Chaney told ABC News Breakfast.
“So, it means that perpetrators can generate the material using images of real children, delete the images, and then recreate them whenever they want and avoid detection.”
Commenting on the proposed bill, International Justice Mission Australia CEO David Braga said it was time for the government to make good on its commitment to support a “digital duty of care”.
“Generative AI child sexual abuse material is abhorrent and needs to be blocked. Much of it includes faces of actual children or was created using known child sexual abuse images and videos,” Braga said.
“Our concern is that this trains the user to see child abuse as acceptable, which it clearly is not. And because AI-generated child sexual abuse images and videos are now almost indistinguishable from abuse images and videos created through the sexual abuse of children, it is a short step from AI-generated content to real-world content. It also adds to the already overwhelming amount of child sexual abuse material that law enforcement are reviewing while trying to identify the real children involved in generating it.”
Braga said that it was “common sense” for the government’s “digital duty of care” legislation to cover child abuse material, and thus place the onus on digital platforms “to take reasonable steps to prevent foreseeable harms on their platforms and services”.
“Now is the time for the Australian government to strengthen the Online Safety Act to require companies across the tech stack, including operating system providers and device manufacturers, to detect and disrupt child sexual abuse material in all its forms on their platforms,” Braga said.
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.