A researcher at Charles Darwin University says his analysis of statements by extremist leaders has uncovered traceable linguistic patterns.
A new study by a linguistics researcher at Charles Darwin University has uncovered traceable patterns in extremist texts that could underpin algorithms for detecting such content online.
Linguistics lecturer Dr Awni Etaywe studied statements by two extremist leaders – Boko Haram’s Abubakar Shekau and ISIS’ Abu Bakr al-Baghdadi – and identified markers and trends relating to identity, emotion, and social values.
“These linguistic markers and contexts allow us to uncover which aspects of identity are attacked and for which function in that specific situation,” Etaywe said in a statement from the university.
“Identity attacks are not just random insults. They are carefully crafted rhetorical tools. They operate by othering and targeting personal traits and behaviours, emphasising power-distance relationships, and undermining interactional roles and master identities.
“Identity attacks are crucial in early detection and assessment of potential threats, providing law enforcement and security agencies with valuable tools for identifying and intercepting violent extremists.”
Etaywe believes his research can be used not only to create algorithms that detect extremist content but also to train analysts and assist community leaders in their efforts to counter violent rhetoric.
“Some might dismiss the importance of linguistic analysis in counterterrorism, but understanding the language used by extremists is vital,” Etaywe said.
“It reveals how identity attacks are strategically constructed to manipulate, dehumanise, and incite violence against specific groups. By uncovering these linguistic strategies, we can better predict and mitigate the threats posed by extremist communications, ultimately saving lives and promoting social cohesion.”
Etaywe said that the speed at which extremist rhetoric can spread online is a key challenge.
“Extremists incite, threaten, forge social (dis)alignments, propagate hatred, and more through language. These texts often spread rapidly online, reaching and radicalising vulnerable individuals,” Etaywe said.
“By analysing the language, we can develop more effective strategies to counteract extremist narratives and prevent the spread of harmful ideologies.”
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.