The Australian eSafety Commissioner has called out a number of technology and social media giants for their lack of effort in fighting the growing amount of child sexual abuse material (CSAM).
eSafety’s latest transparency report revealed that tech giants including Apple, Google, Meta, and Microsoft had made little to no progress in tackling the growing presence of CSAM, despite eSafety raising concerns in 2022 and 2023.
Furthermore, eSafety issued “legally enforceable periodic transparency notices under Australia’s Online Safety Act”, requiring the companies to report to eSafety every six months over a two-year period on how they are tackling CSAM. The notices were sent to Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snap (Snapchat), and Skype.
However, eSafety Commissioner Julie Inman Grant said that, despite the notices, the companies have made no meaningful effort to curb the material.
“This latest report shows many of the same safety gaps and shortcomings we uncovered in our 2022 and 2023 reports still exist today without any meaningful or tangible action being taken to prevent the most depraved abuse of children and young people online,” said Inman Grant.
“It is clear to me that during the last two to three years since we asked these companies how they are tackling online child sexual abuse, they haven’t taken many steps to lift and improve their efforts here, despite the promise of AI to tackle these harms and overwhelming evidence that online child sexual exploitation is on the rise.”
She also said that neither Apple nor Google-owned YouTube disclosed how many user reports of CSAM they received, or how many trust and safety staff they employ, despite being asked.
“It shows that when left to their own devices, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services. We need to keep the pressure on the tech industry as a whole to live up to their responsibility to protect society’s most vulnerable members from the most egregious forms of harm, and that’s what these periodic notices are designed to encourage,” added Inman Grant.
The report, which identified a range of safety gaps, found that none of the providers used tools to detect CSAM in live streams, while individual companies fell short in other areas, such as blocking URLs that link to known CSAM and deploying tools to detect grooming, among other shortcomings.
“No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services,” Inman Grant said.
“This group of eight companies are required to report to me every six months, and in that time, I hope and expect to see some meaningful progress in making their services safer for children.”
eSafety did, however, highlight improvements from a number of the firms, with Discord, Microsoft, and WhatsApp increasing their use of hash-matching tools to detect CSAM, alongside other improvements from Skype, Apple, and Snap.
“While we welcome these improvements, more can and should be done,” said Inman Grant.
“eSafety will continue to shine a light on this issue, highlighting shortcomings and also improvements, to raise safety standards across the industry and protect the world’s children, including through mandatory industry codes and standards.”
eSafety is set to release a second report early next year.