The Australian government has ruled out allowing tech giants to train their AI models on the creative works of Australian musicians and other creatives without permission.
In what is a massive win for Australian creatives, the Albanese government has said that tech companies and AI developers like OpenAI will not be exempt from copyright legislation, meaning they will be unable to train their models on Australian creative works without consent.
“The Albanese government [is] a strong supporter of our arts community, and we recognise that they deserve their fair share for their efforts in producing products, as do media organisations,” said Minister for the Environment and Water, Murray Watt.
“One of the things that we know about AI is that while it can produce enormous productivity gains and many benefits for us on a personal level, we also need to have proper safeguards in place to make sure that in this case artists, media organisations, creatives get their fair share from their work and that’s why we will not be providing an exemption for what’s known as text and data mining for AI companies.”
The announcement comes as the government’s copyright and AI reference group is due to meet this week to assess whether current copyright legislation is fit for purpose when it comes to AI. Ahead of the meeting, Attorney-General Michelle Rowland confirmed that the government would not be changing the legislation to allow AI giants to train their models on Australian works without consent.
The confirmation follows an August report from the Productivity Commission suggesting that regulation could stunt AI growth and thus limit the Australian economy’s potential.
The commission estimated in its report that AI could add more than $116 billion to Australia’s economy, but regulations could get in the way of that economic boost.
“Adding economy-wide regulations that specifically target AI could see Australia fall behind the curve, limiting a potentially enormous growth opportunity,” commissioner Stephen King said.
“The Australian government should only apply the proposed ‘mandatory guardrails for high-risk AI’ in circumstances that lead to harms that cannot be mitigated by existing regulatory frameworks and where new technology-neutral regulation is not possible.”
The Greens accused the Productivity Commission of only thinking about the interests of big businesses and tech companies.
“The commission is using overly optimistic financial projections to dodge proper AI rules and kill off basic digital protections,” Greens Senator and digital rights spokesperson David Shoebridge said.
“The extraordinary power of international tech companies is real, and that’s an even more important reason to not let them dictate law in this country.
“The commission’s anti-regulation push reads like the corporate wish list of Amazon, Google and Apple. It flies in the face of unions and creatives fighting for workers’ rights.”
Despite the good news from the Australian government this week, Digital Rights Watch spokesperson Tom Sulston told SBS that the decision not to grant an exemption does not go far enough to protect Australian creatives, with social media content still at risk.
“Sure, at the moment, the AI companies are able to ingest content from social media without gathering the consent of the people who have written it. Most social media companies will have a Terms of Use that essentially say anything you put into their system they can do whatever they want with. And we don’t think that is sufficient,” he said.
“As individuals, we still control that data, so it’s important that we’re allowed to say, I don’t want this to be part of an AI training thing or used for something that I don’t want it to be used for. And then similarly, our personal and private information is also up for grabs by AI companies.”
That said, Australia has set itself apart in its decision not to allow AI giants to exploit Australian creative works.
In May, the UK government proposed a bill that would loosen copyright laws and allow AI developers to use whatever content they have lawful access to, in an effort to make the UK a global AI leader.
As part of the plan, creators would have been required to actively opt out to prevent AI firms from training their models on their content.
Speaking with the BBC, Elton John said he feels “betrayed” by the current government and prime minister, whom he has otherwise been a supporter of.
“The government are just being absolute losers, and I’m very angry about it,” he said.
“The danger is for young artists, they haven’t got the resources to keep checking or fight big tech. It’s criminal, and I feel incredibly betrayed.
“A machine ... doesn’t have a soul, doesn’t have a heart, it doesn’t have human feeling, it doesn’t have passion. Human beings, when they create something, are doing it ... to bring pleasure to lots of people.”
In a joint letter signed alongside 400 other artists, Paul McCartney also slammed the government.
“We’re the people, you’re the government. You’re supposed to protect us. That’s your job,” McCartney said.
“So if you’re putting through a bill, make sure you protect the creative thinkers, the creative artists, or you’re not gonna have them. If there’s such a thing as a government, it’s their responsibility – I would think – to protect young people to try and enhance that whole thing so it works. So that these people have got [a] job and can enhance the world with wonderful art.”
Despite the pushback, the UK passed the Data (Use and Access) Bill.
Much of the bill’s detail will be set out in future legislation and so will not take effect until at least 2026, as the government is adamant that AI training practices should be regulated not in this bill but in a dedicated AI bill, which may not appear until next year.
“The deadlock has now cleared – but it leaves a regulatory gap. The government has made clear it will not be drawn into regulating AI training practices via fragmented amendments. Instead, it remains committed to introducing a ‘comprehensive’ AI bill in the next parliamentary session – though that could be as late as 2026,” said Rebecca Newman, a managing associate in the commercial disputes team at Addleshaw Goddard.
“This outcome leaves the question of whether AI developers must ensure their models are trained in accordance with UK copyright law unresolved – but given the ongoing Getty trial, the answer will likely be shaped first by the courts, not Parliament.”