The Elon Musk-owned social media platform has earned the commissioner’s ire over a lack of response to child sexual exploitation material being shared on the platform.
eSafety sent legal notices to Twitter (since renamed to X) – alongside Google, TikTok, Twitch and Discord – in February 2023, hoping for answers explaining how the company plans to tackle the growing amount of exploitative material on the platform. It has now released a report on the responses it received.
“Our first report featuring Apple, Meta, Microsoft, Skype, Snap, WhatsApp and Omegle uncovered serious shortfalls in how these companies were tackling this issue,” said eSafety Commissioner Julie Inman Grant in a statement today (16 October).
“This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion, and we need them all to do better,” Grant said.
“What we are talking about here are serious crimes playing out on these platforms committed by predatory adults against innocent children, and the community expects every tech company to be taking meaningful action.”
Google was issued a formal warning after offering “a number of generic responses to specific questions” regarding its practices to tackle child exploitation material, but X’s non-compliance was far more serious, according to eSafety.
In fact, it failed to respond at all to “a number of key questions” regarding its time to respond to complaints and the measures in place to detect exploitation material in live streams. In addition, X also failed to “adequately answer questions relating to the number of safety and public policy staff” still employed by the company.
X has subsequently been fined $610,500 for its non-compliance and has 28 days to pay up. If it fails to pay, the eSafety Commissioner has said other actions will be taken.
“Twitter/X has stated publicly that tackling child sexual exploitation is the number one priority for the company, but it can’t just be empty talk,” Grant said. “We need to see words backed up with tangible action.”
“If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation, they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinise their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”
The Office of the eSafety Commissioner has previously taken Twitter/X to task over the rise of hate speech in the lead-up to the recently failed Voice referendum and how it handles cyber bullying and hate speech in general.
Jacqueline Jayne, security awareness advocate for the APAC region at KnowBe4, expressed astonishment at X’s behaviour.
"I can’t believe that in 2023 one of the world’s largest social media platforms has to be taken to court for such matters," Jayne told Cyber Daily via email.
"What world are we living in where we are fighting corporations to do something about child sexual abuse material (CSAM)?
"These organisations – Google, Discord, X (Twitter) and the rest – should not have to be asked even once to block known illegal content such as this. They are quick to block other material nowhere near as deplorable or harmful as CSAM.
"While I am proud that the Australian eSafety Commission has taken the lead to introduce laws that compel organisations to explain what they are doing to combat online child exploitation and abuse, I am equally angry that they should have to do it."