
Australian man fined $350k for creating AI deepfake pornography

In an Australian first, a man has been fined almost $350,000 after he created AI-generated pornographic deepfakes of multiple women.


As reported by Information Age, the Federal Court has ordered Australian man Anthony Rotondo to pay $343,500 for creating and distributing 12 AI-generated pornographic deepfakes of six Australian women without their consent between late 2022 and October 2023, when he was arrested.

“Even though it is clearly a fake video, I feel violated, vulnerable and completely without agency,” said one victim, adding that the crimes made her feel “horrified”.

Prior to his arrest, Rotondo had been ordered to remove the deepfakes he had created from a site called MrDeepFakes.com, which has since been shut down. Despite the order, he failed to comply.


Following his arrest, the court noted that Rotondo “acknowledged that his posting of deepfake images could cause upset and distress”, but that he expressed no remorse, describing the creation of deepfakes as a “hobby” he found “fun”. He did, however, provide authorities with login details so that the deepfakes he had uploaded could be removed.

Two months later, in December 2023, Rotondo was found to be in contempt of court after admitting to violating three court orders: to remove the deepfakes, to create no more of them, and not to publish the court orders, as they contained the names of the victims.

A day later, the court found that Rotondo had forwarded 50 emails to eSafety Commission staff, one of which contained the court order, while others included correspondence from media publications.

Now, alongside the $343,500 fine, Rotondo has been ordered to also pay the eSafety Commission’s legal costs.

“This action sends a strong message about the consequences for anyone who perpetrates deepfake image-based abuse,” said an eSafety Commission spokesperson.

“eSafety remains deeply concerned by the non-consensual creation and sharing of explicit deepfake images, which can cause significant psychological and emotional distress.”

eSafety commissioner Julie Inman Grant added that the first-of-its-kind ruling set an important precedent that would hopefully prevent others from committing the same crime.

“We brought this case in 2023, and it has been filled with twists and turns, but we think this is an important test in the use of our remedial powers towards determined perpetrators and hope this serves as an important deterrent for this cruel, destructive and anti-social misuse of easily accessible AI tools,” Inman Grant said.

Last month, the Australian government announced that it would move to ban what it refers to as AI deepfake “nudify” apps, alongside other stalking programs.

“The rapid acceleration and proliferation of these really powerful AI technologies is quite astounding. You don’t need thousands of images of the person or vast amounts of computing power … You can just harvest images from social media and tell the app an age and a body type, and it spits an image out in a few seconds,” said Inman Grant last year.

Now, after a long period of government interest, the Albanese government has announced that it will take action against these applications beyond existing legislation, which already prohibits the distribution of non-consensual sexually explicit material and stalking.

“There is a place for AI and legitimate tracking technology in Australia, but there is no place for apps and technologies that are used solely to abuse, humiliate and harm people, especially our children,” said Communications Minister Anika Wells on Tuesday (2 September).

“That’s why the Albanese government will use every lever at our disposal to restrict access to nudification and undetectable online stalking apps and keep Australians safer from the serious harms they cause.”

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications, including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music and spends his time playing in bands around Sydney.