USC study focuses on online hate speech and extremism
Researchers at the USC Dornsife College of Letters, Arts and Sciences hypothesized that a high degree of convergence in moral concerns within online communities is linked to an increase in radical intentions and extremism, that is, the willingness to participate in illegal or violent political action.
In research published this week in Social Psychological and Personality Science, they found that the degree of shared moral concerns, or “moral convergence,” within an online group predicts the number of hate speech messages posted by members.
“Our research team examined how morality motivates people to engage in various types of behavior, from giving in a disaster to taking extreme action, even violence, to protect their group,” said the study’s lead author, Mohammad Atari, who recently completed his doctorate in the Department of Psychology at USC Dornsife and is now a postdoctoral fellow at Harvard University. “They feel like other people are doing something morally wrong and it is their sacred duty to do something about it, even if it means posting hate speech and committing hate crimes.”
Scientists first analyzed posts on Gab, an alternative social media network popular with right-wing extremists. The platform, which claims to uphold free speech and does not moderate hate speech, offered researchers a unique opportunity to study the dynamics that can lead to radicalization.
They found that Gab users whose moral profile resembled that of their immediate group – meaning they shared values and felt similarly about basic moral concerns, including care, fairness, loyalty, purity and authority – were more likely to disseminate hate speech and to use language intended to dehumanize, or even call for violence against, members of an outgroup.
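Moral convergence of this kind can be quantified, for example, as the similarity between one user’s moral-foundation profile and the average profile of the surrounding group. The sketch below is a hypothetical illustration of that idea, not the authors’ actual analysis pipeline; the five-dimensional score vectors, the `moral_convergence` helper and the toy numbers are all assumptions for demonstration.

```python
import math

# The five foundations of Moral Foundations Theory, in a fixed order.
FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "purity"]

def cosine(u, v):
    """Cosine similarity between two equal-length score vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def moral_convergence(user_profile, group_profiles):
    """Similarity of one user's moral profile to the group's mean profile."""
    n = len(group_profiles)
    mean = [sum(p[i] for p in group_profiles) / n for i in range(len(FOUNDATIONS))]
    return cosine(user_profile, mean)

# Toy example: per-foundation concern scores in [0, 1] for one user
# and two other members of their immediate group.
user = [0.9, 0.1, 0.8, 0.7, 0.6]
group = [[0.8, 0.3, 0.9, 0.6, 0.7], [1.0, 0.1, 0.7, 0.8, 0.5]]
print(round(moral_convergence(user, group), 3))
```

A value near 1 would indicate a user whose moral concerns closely mirror the group’s; the study’s finding is that higher convergence of this sort predicts more hate speech.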
Extremism on social networks linked to shared values and morality
The researchers replicated the Gab findings by examining another extremist network in the online community Reddit. They analyzed a subreddit called “Incels” – involuntarily celibate men who blame women for their inability to find sexual partners – and found that those who were morally like-minded produced more hateful and misogynistic speech.
A few years ago, scientists from USC and other institutions together developed a model for detecting moralized language. Built on a deep learning framework, the program can reliably identify text evoking moral concerns associated with different moral values and their opposites. These values, as defined by Moral Foundations Theory, center on care/harm, fairness/cheating, loyalty/betrayal, authority/subversion and purity/degradation.
Moral Foundations Theory is a theory in social and cultural psychology that explains the evolutionary origins of human moral intuitions based on innate feelings rather than logical reasoning.
Morality binds us together and gives structure and direction to our society … But morality also has a dark side.
Morteza Dehghani, USC Dornsife Associate Professor of Psychology and Computer Science
“Morality binds us together and gives our society structure and direction to care for those in need, as well as a vision of a just and prosperous future for the group. But there is a dark side to morality, too, in that its extreme forms can lead to the opposite of many of these positive principles,” said Morteza Dehghani, associate professor of psychology and computer science. He runs the Computational Social Science Lab at USC Dornsife, where he and others study how morality intertwines with prejudice and hatred.
Social media platforms help breed extremism and allow extremists to find each other and, as Dehghani describes, “mutually fuel their worldviews and their anger towards the outgroup.”
Experimental studies further revealed the role of morality in online extremism
In three controlled experimental studies, the research team further demonstrated that leading people to believe that other members of their hypothetical or real group shared their views on moral issues increased their radical intentions to protect the group at any cost, even by violent means. When American study participants were led to believe that other Americans shared their moral views, they became more willing to “fight and die” for their country and the values it stands for.
“These findings underscore the role of moral convergence and family-like bonds in radicalization, highlighting the need for a diversity of moral worldviews within social networks,” Atari said.
But, he admitted, it’s easier said than done. Further study is needed to determine the most effective interventions for online communities to introduce different perspectives, which may be the key to stopping radicalization.
#StoptheSteal has its roots in online radicalization
The real threats posed by online radicalization were recently illustrated by the storming of the United States Capitol on January 6. Those who were convinced that the 2020 presidential election had been stolen from former President Donald Trump organized online under the hashtag #StoptheSteal on Facebook and on Gab, which served as a hub for organizing the insurrection.
When people are motivated by morality, regardless of their political affiliation, it blurs their judgment.
Mohammad Atari, lead author of the study
These radicalization studies were already well advanced before the January 6 insurrection. Even so, Atari said, the events of that day further motivated the research team’s efforts to understand online radicalization.
He added that identifying as conservative or liberal does not necessarily predict who is predisposed to radicalization. “When people are motivated by morality, regardless of their political affiliation, it blurs their judgment,” Atari said.
USC researchers in many disciplines study political polarization and radicalization – how it begins and how it can be mitigated.
The study was funded by National Science Foundation CAREER award BCS-1846531.