Regardless of their origins, communities on social media platforms are constantly adapting to changes in their preferred landscape. However, when a change in leadership snowballed into the erasure, replacement and transformation of what was previously known as Twitter, communities found themselves adjusting to change on a much larger scale. When entrepreneur Elon Musk purchased Twitter, now named ‘X’, in 2022, his swift dismissal of a large fraction of company staff, his unbanning of accounts previously removed for mass harm, hatred and misinformation, and his monetisation of basic platform functions left the future of the platform uncertain (Ghazzawi, 2024; Poon, 2024; Molden, 2023). What followed was one of the most controversial brand transformations in history, built upon monetisation, tolerance of hatred and the upheaval of a platform with a legacy of over a decade. What was once a platform that thrived on social connection through multimedia microblogging has become a space in which far-right extremism runs rampant. X has become a prime example of how far-right extremism finds space online, owing to the moderation changes and the overtures to extremist groups made by Musk since his acquisition of the platform.
Moderation changes paving the way for far-right extremists
Policy and moderation changes made by Musk since his 2022 acquisition have led to a surge in extremist behaviour: most notably, hate speech against minority groups. Extremist viewpoints, behaviours and groups have always found a home on the Internet, but an understanding of what constitutes extremism is essential to furthering this discussion. Extremism, while a difficult and broad concept to pin down, refers to any individual willing to take extreme action for a particular cause (Cassam, 2021). However, this definition is broad, subjective and does not necessarily describe extremism as being problematic, violent or impactful. Cassam proposes a categorisation of extremism, suggesting that there are at least three subtypes: methods-based, ideological and psychological (Cassam, 2021). Most significant to the case of X is ideological extremism, referring to extremist ideas and concepts that either precede actions or stand in place of them (Cassam, 2021). The extremist ideology for which Musk has created space through his actions as the owner of X is that of the far right: individuals holding extreme versions of Western right-wing political beliefs. The creation of space for this group began even before Musk acquired the platform, in his statements of intent to prevent the platform from becoming “too woke” (Benton et al., 2022, p. 3). The use of the term ‘woke’ to describe censorship, political climates and communities has roots in American political discourse, stemming from right-wing spaces as a label for those with progressive ideologies or a disregard for traditional political views (Vogel, 2023). These terms, as used by Musk, resonated with right-wing communities, who quickly celebrated when Musk took actions loyal to these intentions (Benton et al., 2022). 
A notable action during this period was Musk forcibly dissolving the Twitter Trust and Safety Council, staying loyal to his commitment to reduce moderation and ‘censorship’ on the platform (O’Brien & Ortutay, 2022). While the Council held no formal moderation authority, it was a carefully assembled volunteer group of over one hundred individuals and organisations dedicated to addressing child safety, suicide, self-harm and hate speech on the platform (O’Brien & Ortutay, 2022). This was one of many cuts Musk implemented upon acquiring the platform, reducing staff numbers by close to half under his new ownership of X (Ghazzawi, 2024). With less oversight on the platform, Musk projected an image that users would no longer be constantly watched, surveilled and ‘censored’, and users celebrated by expressing ideas, feelings and statements that they could not under the previous ownership: particularly hate speech.
An upwards trend in hate speech is observable from the very first day of Musk’s ownership of X. Through analysis of a sample of popular posts containing at least one derogatory term or slur, much of this content was found to have been authored after Musk acquired the platform, dating back to the first day of his ownership (Keane, 2024). However, Keane notes that the algorithm used to flag these posts as hate speech relied on a dictionary of words, which disregards alternative usages of slurs and derogatory terms online. Such an algorithm invites discussion of the reclamation of slurs by minority groups, typos, posts made in languages other than English, and users simply testing the strictness of the new moderation, none of which necessarily pertains to hatred. This study does, however, highlight a separate issue created by Musk: the limitation of contemporary studies, investigations and analyses of the platform. In 2023, Musk revoked free access to the Twitter API, introducing a subscription service in its place (Browne, 2023). Since this change, users must state their intended uses for the data within the developer portal, as well as pay a subscription fee ranging from hundreds to tens of thousands of dollars per month for developer portal and API access (X, n.d.). This decision significantly impacted efforts to research the platform, halting more than one hundred academic studies indefinitely (Gotfredsen, 2023). These new conditions have resulted in the rare case of earlier research being more thorough than modern attempts. Thus, referring to research dated before this paywall further solidifies the case that hate speech has risen specifically under Musk’s leadership.
In 2022, as the upwards trend was developing and research was less obstructed, hate speech could be detected using an algorithm that analysed the context, connotations and usages of these terms to precisely classify a post as hate speech (Benton et al., 2022). Not only did this study observe a similar spike in hate speech, it also showed that over sixty-seven per cent of usages occurred in contexts where the relevant marginalised groups were referred to negatively (Benton et al., 2022). This is a significant increase compared to before Musk acquired X, suggesting changes in moderation, culture or both. It also strengthens modern research, which, while more limited in contextual analysis, displays strikingly similar results to prior studies. With the moderation changes declared and confirmed by both platform staff and Musk himself, there is little room to doubt that these results are connected. This research also points to an increase in hateful behaviours, communities and cultures since the acquisition, which is further corroborated by the notable demographic that celebrated it.
Content from a leader figure connecting with far-right extremist audiences
Musk’s acquisition of X has led to a unification of, and surge in, far-right extremist communities, as Musk resonates with this demographic through his content on the site. To begin with, Musk has an established history of aligning himself with far-right groups, their causes and their values. Less directly, Musk has a documented past of using language, imagery and concepts that resonate with the far-right community. One such incident was his controversial decision to visually compare Canadian leader Justin Trudeau to Adolf Hitler in a now-deleted post on his platform, which the Auschwitz Museum condemned as exploitative and “disrespectful” (Neate, 2022). The incident is even more significant when the context in which it was made is analysed. Musk made this comparison in direct response to a report that Canadian authorities were restricting cryptocurrency donations to anti-vaccination causes (Neate, 2022). At the time, Canada was experiencing major economic disruption because of anti-vaccination protests by truckers (Neate, 2022). Prior to making this comparison, Musk had already implied his alliance with these protestors, slowly securing his position as an ally on an issue that right-wing communities find significant (Neate, 2022). Other notable instances include his frequent use of the term ‘woke mind virus’ to refer to individuals who think or live progressively, particularly in matters of defending minority groups (Sagastume Muralles, 2024). Once again weaponising the term ‘woke’ – a popular term of dismissal or demonisation within right-wing spaces – this phrase implies that anybody who disagrees with his worldview is sick, infected or otherwise unfit to assert views. 
While these passive efforts are the foundation for speculation about his potential far-right beliefs, Musk’s more direct activities since acquiring the platform strengthen the argument that he is interested in aligning with far-right communities. Since the acquisition, Musk has made several posts containing well-known dog whistles for white supremacy and Nazism. In particular, the dog whistles ‘88’ and ‘14’ have appeared in posts made by Musk, the former meaning ‘Heil Hitler’ and the latter referring to a slogan used by a notable white supremacist group (Gilbert, 2022; Weimann & Ben Am, 2020). While it can be argued that these numbers were arbitrary figures thrown into mundane posts, intention does not nullify the dialogue these figures have generated on the platform. Regardless of the intention behind these numbers, far-right ideological extremists across the web have found resonance in his posts, celebrating his usage of these terms openly (Gilbert, 2022). With many of these celebrations remaining visible on the platform to date, the lack of action, consideration or moderation on these matters aligns directly with the promises Musk made to allow ‘free speech’ regardless of political beliefs. By creating this content and subsequently permitting extremist commentary and celebration around it, Musk has undoubtedly contributed to a larger-scale problem of ideologically extreme behaviours and communities across the platform.
Conclusion
Overall, the acquisition of Twitter – now X – by Elon Musk marks a significant turning point in the presence of far-right extremism on the platform. When Musk purchased the platform in 2022, his swift mass layoffs, structural changes and policy revisions within the company suggested that the platform was about to undergo rapid change. This proved true when Musk dismissed a large fraction of platform staff and forcibly dissolved existing resources and safety measures such as the Twitter Trust and Safety Council. These decisions have resulted in a higher presence of hate speech and hateful content, contributing to a normalisation of this behaviour under the guise of ‘free speech’. While such behaviours are not unfamiliar to internet spaces, the invitation, acceptance and entertainment of these communities on such a large scale, and in such a short time, makes the case of Elon Musk and X significant. Additionally, the use and allowance of content appealing to far-right extremists contributes to the growing presence of far-right extremism on the platform. The use of digital dog whistles, intentional or not, has led to a high concentration of discussion and celebration around Nazism, fascism and other far-right ideological extremes. Elon Musk, X and his governance of the platform hold significant opportunities to explore the influence platform owners wield, and how readily the masses respond to it. In particular, this case reveals a certain fragility in online communities that revolves around the responsibility – or lack thereof – of those governing the spaces in which communities gather. By ‘unmoderating’ a platform to the benefit of ideological extremists, Musk has singlehandedly created a safe space for unsafe behaviours and communities to take shelter, which could have devastating consequences should this extremism leave the ideological plane.
References
Benton, B., Choi, J-A., Luo, Y., & Green, K. (2022). Hate speech spikes on Twitter after Elon Musk acquires the platform. School of Communication and Media, Montclair State University, 33. https://digitalcommons.montclair.edu/scom-facpubs/33
Browne, R. (2023, February 2). Twitter will start charging developers for API access as Elon Musk seeks to drive revenue. CNBC. https://www.cnbc.com/2023/02/02/twitter-to-start-charging-developers-for-api-access.html#
Cassam, Q. (2021). Extremism: A philosophical analysis. Routledge. https://doi.org/10.4324/9780429325472
Gilbert, D. (2022, November 29). Elon Musk is turning Twitter into a haven for Nazis. Vice. https://www.vice.com/en/article/elon-musk-twitter-nazis-white-supremacy/
Ghazzawi, I. (2024). At the helm of Twitter: The leadership style of Elon Musk. Journal of Case Research and Inquiry, 9(1), 78–107. https://www.researchgate.net/publication/379664239_AT_THE_HELM_OF_TWITTER_THE_LEADERSHIP_STYLE_OF_ELON_MUSK
Gotfredsen, S. G. (2023, December 6). Q&A: What happened to academic research on Twitter? Columbia Journalism Review. https://www.cjr.org/tow_center/qa-what-happened-to-academic-research-on-twitter.php
Keane, R. (2024). Elon Musk’s purchase of Twitter and its effect on hate speech impressions. The National High School Journal of Science, Advance online publication. https://nhsjs.com/2024/elon-musks-purchase-of-twitter-and-its-effect-on-hate-speech-impressions/
Molden, D. (2023). A tale told by an idiot: Elon Musk and free speech on Twitter. Aichi Shukutoku University Bulletin, 7(1), 15–24. https://aska-r.repo.nii.ac.jp/records/8672
Neate, R. (2022, February 18). Elon Musk criticised for likening Justin Trudeau to Adolf Hitler in tweet. The Guardian. https://www.theguardian.com/technology/2022/feb/17/elon-musk-criticised-for-comparing-justin-trudeau-to-adolf-hitler-tweet-auschwitz
O’Brien, M., & Ortutay, B. (2022, December 13). Musk’s Twitter dissolves Trust and Safety Council. The National News Desk. https://thenationaldesk.com/news/americas-news-now/musks-twitter-dissolves-trust-and-safety-council-elon-twitter-files-social-media-platform-independent-civil-human-rights-hate-speech-child-exploitation-suicide-self-harm?photo=3
Poon, C. V. (2024). A social network for misinformation and hate: Twitter After Elon Musk. Capstone. https://capstone.capilanou.ca/2024/02/04/social-network-for-misinformation-and-hate-twitter-after-elon-musk/
Sagastume Muralles, J. (2024). Visions and values of algorithms: A study of how Elon Musk envisions X’s algorithms [Master’s thesis, Lund University]. Lund University Libraries. https://lup.lub.lu.se/student-papers/search/publication/9151492
Vogel, J. (2023). Right To Woke [Master’s thesis, Erasmus University Rotterdam]. Erasmus University Rotterdam. https://thesis.eur.nl/pub/75520/eobs_108612.pdf
Weimann, G., & Ben Am, A. (2020). Digital dog whistles: The new online language of extremism. International Journal of Security Studies, 2(1), 4. https://www.academia.edu/download/83354808/viewcontent.pdf
X. (n.d.). X Developers. X. https://developer.x.com/en/portal/petition/essential/basic-info