
Shadow AI surges in Aussie businesses as 1 in 3 professionals admit to uploading sensitive data

Too many Australian businesses are wildly unprepared to assess AI risks and lack critical visibility into how AI is being used within their organisations.


New research has revealed that many Australian organisations are unaware of how their employees are using artificial intelligence, with sensitive company data at risk of exposure.

A survey by independent research firm Censuswide on behalf of software-as-a-service firm Josys found that one in three Australian professionals regularly expose company data and customer information while employing unauthorised AI tools.

“Shadow AI is no longer a fringe issue. It’s a looming full-scale governance failure unfolding in real time across Australian workplaces,” Jun Yokote, COO and president of Josys International, said in a recent statement.

Worryingly, the survey of 500 Australian technology decision-makers found that 70 per cent of companies had no more than moderate visibility into the AI tools being used by their workers, while only 30 per cent of smaller companies, those with fewer than 250 employees, reported feeling completely confident in their ability to assess AI-related risk.

Similarly, 63 per cent of workers lack confidence in their ability to use AI securely. The survey also found that only around half of companies in heavily regulated sectors, such as finance and healthcare, reported being fully prepared for AI to become part of their workflows.

Compliance is a sore point for many organisations. Forty-seven per cent of Australian companies cited Privacy Act amendments and AI transparency requirements as compliance concerns. Half of the organisations surveyed still rely on manual policy reviews, while a third lack AI governance frameworks altogether.

“While the nation is racing to harness AI for increased productivity, without governance, that momentum quickly turns into risk. Productivity gains mean nothing if they come at the cost of trust, compliance, and control. What’s needed is a unified approach to AI governance which combines visibility, policy enforcement, and automation in a single, scalable framework,” Yokote said.

You can read Josys’ full report here.

David Hollingworth

David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.

