
eSafety Commissioner warns schools and carers of rising impact of deepfakes among students

Reports of image-based abuse of school-aged children doubled in the last 18 months, as eSafety launches a new toolkit for schools on how to handle deepfake incidents.


Julie Inman-Grant, Australia’s eSafety commissioner, has written to education ministers around Australia to make sure that schools are aware of child protection legislation and reporting obligations, as reports of abusive deepfakes circulating between students continue to rise.

As nudify apps – AI-powered applications that can create a fake “nude” image of someone regardless of how fully dressed they are – become more available and easier to use, reports of deepfakes targeting under-18s have doubled in the last 18 months.

“Creating an intimate image of someone under the age of 18 is illegal. This includes the use of AI tools which are being used to create deepfake image-based abuse and synthetic child sexual exploitation material (CSEM),” Inman-Grant said in a 27 June post to LinkedIn.

“Parents and carers can help educate their children that this behaviour can lead to criminal charges. I’m also calling on schools to report allegations of a criminal nature, including deepfake abuse of under-aged students, to police and to make sure their communities are aware that eSafety is on standby to remove this material quickly – with a 98 per cent success rate.

“It is clear from what is already in the public domain, and from what we are hearing directly from the education sector, that this is not always happening.”

To assist schools in combating deepfakes, eSafety has released a toolkit outlining how schools and educators can prepare for deepfake incidents, engage with school communities to educate students, staff, and carers, and respond when malicious deepfakes are reported by staff or students.

eSafety has also released an advisory on the subject, outlining what deepfakes are, how they can be harmful or even illegal, and how to respond to both the creators and victims of deepfake-based image abuse.

“Our response guide helps schools prepare for and manage deepfake incidents, taking into account the distress and lasting harms these can cause to those targeted,” Inman-Grant said.

“It also encourages schools to openly communicate their online safety policies and procedures, and the potential for serious consequences, including criminal charges in some instances for perpetrators who may be creating synthetic [child sexual exploitation material].”


eSafety encourages anyone who has been a victim of image-based abuse to report it at www.esafety.gov.au.

David Hollingworth

David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.
