The artificial intelligence (AI) content crackdown comes as content creators use AI to recreate victims of violent crimes.
Google has announced an update to its harassment and cyberbullying policies, aimed at stopping content creators from using AI to produce virtual recreations of minors killed in violent crimes.
The new rules come into effect on 16 January.
“On January 16,” Google said in a YouTube policy update, “we’ll begin striking content that realistically simulates deceased minors or victims of deadly or well-documented major violent events describing their death or violence experienced”.
Under the current anti-harassment rules, failure to comply with the policy results in a warning and the removal of the offending content. The creator can also complete policy training, which will see the warning expire after 90 days.
However, a second policy violation within 90 days will lead to a strike against the user. If a creator gets three strikes within a 90-day period, their channel will be terminated.
Severe policy abuses, though, could lead to harsher measures.
“We may terminate your channel or account for repeated violations of the Community Guidelines or Terms of Service,” Google’s harassment policy said. “We may also terminate your channel or account after a single case of severe abuse, or when the channel is dedicated to a policy violation.”
The changes to the policy are a reaction to a growing trend among true crime content creators on social media. Some creators have begun using AI-generated images and voices to recreate the victims of violent crimes, who then narrate the incident in which they lost their lives. The Washington Post reported on the trend in August 2023, noting that one such video doing the rounds on TikTok depicted the likeness of James Bulger, who was abducted and murdered in the UK in 1993.
While TikTok has said it finds the videos “disturbing” and was deleting them as they were found, the Post reported, many were still available to watch on YouTube.
Speaking to The Daily Mirror at the time, Bulger’s mother, Denise Fergus, said that she found the videos to be “disgusting”.
“It is one thing to tell the story, I have not got a problem with that,” Fergus said. “Everyone knows the story of James anyway. But to actually put a dead child’s face, speaking about what happened to him, is absolutely disgusting. It is bringing a dead child back to life. It is wrong.”
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.