
X, Home of The Far Right: how a change in platform leadership transformed its political culture


Regardless of their origins, communities within social media platforms are constantly adapting to changes in their preferred landscape. However, when a change in leadership snowballed into the erasure, replacement and transformation of what was previously known as Twitter, communities found themselves adjusting to change on a much larger scale. When entrepreneur Elon Musk purchased Twitter, now named ‘X’, in 2022, his swift dismissal of a large fraction of company staff, his unbanning of accounts previously suspended for mass harm, hatred and misinformation, and his monetisation of basic platform functions left the future of the platform uncertain (Ghazzawi, 2024; Poon, 2024; Molden, 2023). What followed was one of the most controversial brand transformations in history, built upon monetisation, tolerance of hatred and the upheaval of a platform with a legacy of over a decade. What was once a platform that thrived on social connection through multimedia microblogging has turned into a space in which far-right extremism runs rampant. X has become a prime example of how far-right extremism finds space online, owing to the moderation changes and the efforts to connect with extremist groups that Musk has made since his acquisition of the platform.

Moderation changes paving the way for far-right extremists

Policy and moderation changes made by Musk since his 2022 acquisition have led to a surge in extremist behaviour, most notably hate speech against minority groups. Extremist viewpoints, behaviours and groups have always found residence on the Internet, but an understanding of what constitutes extremism is essential to this discussion. Extremism, while a difficult and broad concept to pinpoint, refers to any individual willing to take extreme action for a particular cause (Cassam, 2021). This definition, however, is broad, subjective and does not necessarily frame extremism as problematic, violent or impactful. Cassam proposes a categorisation of extremism, suggesting at least three subtypes: methodical, ideological and psychological (Cassam, 2021). Most significant to the case of X is ideological extremism, which refers to extremist ideas and concepts, either preceding actions or standing in place of them (Cassam, 2021). The extremist ideology Musk has created space for through his actions as the owner of X is that of the far right: individuals holding extreme versions of Western right-wing political beliefs. The creation of space for this group began even before Musk acquired the platform, in his statements of intent to prevent it from becoming “too woke” (Benton et al., 2022, p. 3). The use of the term ‘woke’ to describe states of censorship, political climates and communities has roots in American political landscapes, stemming from right-wing spaces to describe those with progressive ideologies or a disregard for traditional political views (Vogel, 2023). Musk’s use of these terms resonated with right-wing communities, who quickly celebrated when he undertook actions loyal to these intentions (Benton et al., 2022). A notable action during this period was Musk forcibly dissolving the Twitter Trust and Safety Council, consistent with his stated commitment to reduce moderation and ‘censorship’ on the platform (O’Brien & Ortutay, 2022). While the Council held no formal moderation authority, it was a carefully assembled volunteer body of more than one hundred individuals and organisations dedicated to addressing child safety, suicide, self-harm and hate speech on the platform (O’Brien & Ortutay, 2022). Its dissolution was one of many cuts Musk implemented upon acquiring the platform, reducing staff numbers by close to half under his new ownership (Ghazzawi, 2024). With reduced surveillance on the platform, Musk projected an image that users would no longer be constantly watched, surveyed and ‘censored’, and users celebrated by expressing ideas, feelings and statements they could not under the previous ownership: particularly hate speech.

An upwards trend in hate speech is observable from the very first day of Musk’s ownership of the X platform. Through analysis of a sample of popular posts containing at least one derogatory term or slur, much of this content was found to have been authored after Musk acquired the platform, dating back to the first day of his ownership (Keane, 2024). However, Keane notes that the algorithm used to flag these posts as hate speech relied on a dictionary of words, which disregards alternate usages of slurs and derogatory terms online. A dictionary approach raises questions about slurs reclaimed by minority groups, typos, posts in languages other than English, and users simply testing the strictness of the new moderation, none of which necessarily pertains to hatred. The study does, however, highlight a separate problem created by Musk: the limitation of modern studies, investigations and analyses of the platform. In 2023, Musk revoked free access to the Twitter API, introducing a subscription service in its place (Browne, 2023). Since this change, users must state their intended uses for the information within the developer portal, as well as pay a subscription fee ranging from hundreds to tens of thousands of dollars per month for developer portal and API access (X, n.d.). This decision significantly impacted efforts to research the platform, halting more than one hundred academic studies indefinitely (Gotfredsen, 2023). These new conditions have produced a rare case of earlier research being more thorough than modern attempts. Referring to research dated before this paywall thus further solidifies the case that hate speech has risen specifically under Musk’s leadership.
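Keane’s flagging method is not published as runnable code; the sketch below is a minimal illustration of the kind of dictionary lookup described, with placeholder tokens standing in for real slurs. It shows how a surface match flags reclamation, moderation-testing and genuine hatred identically.

    # Illustrative sketch only: not Keane's (2024) actual code, and the
    # lexicon uses placeholder tokens rather than real derogatory terms.
    HATE_LEXICON = {"slur_a", "slur_b"}

    def flag_post(text: str) -> bool:
        """Flag a post when any token matches the lexicon, ignoring case."""
        tokens = text.lower().split()
        return any(token.strip(".,!?") in HATE_LEXICON for token in tokens)

    # The lookup cannot distinguish these: all surface matches are flagged.
    print(flag_post("Proud to call myself a slur_a"))  # True (possible reclamation)
    print(flag_post("Will the mods remove slur_b?"))   # True (testing moderation)
    print(flag_post("A completely unrelated post"))    # False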

In 2022, as the upwards trend was developing and research was less obstructed, hate speech could be detected using an algorithm that analysed the context, connotations and usage of these terms before designating a post as hate speech (Benton et al., 2022). Not only did this study observe a similar spike in hate speech, it showed that over sixty-seven per cent of usages occurred in contexts where the relevant marginalised groups were referred to negatively (Benton et al., 2022). This is a significant increase compared to before Musk acquired X, suggesting changes in moderation, culture or both. It also strengthens modern research, which, while more limited in context analysis, displays strikingly similar results to prior studies. With the moderation changes declared and confirmed by both platform staff and Musk himself, there is little room to doubt that these results are connected. This research also points to an increase in hateful behaviours, communities and cultures since the acquisition, further evidenced by a notable demographic that celebrated it.
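Benton et al.’s classifier is not reproduced here in runnable form; as an assumed stand-in, the sketch below passes two uses of the same verb through a publicly available hate speech model from the Hugging Face hub (cardiffnlp/twitter-roberta-base-hate, an assumption rather than the study’s actual tool) to illustrate how context-sensitive scoring differs from a dictionary match.

    # Assumption: cardiffnlp/twitter-roberta-base-hate as a stand-in
    # classifier; this is not the model Benton et al. (2022) used.
    from transformers import pipeline

    classifier = pipeline("text-classification",
                          model="cardiffnlp/twitter-roberta-base-hate")

    # The same surface word scores differently once context is modelled,
    # which a pure dictionary lookup cannot capture.
    for text in ["I hate waiting for delayed flights",
                 "I hate immigrants and want them deported"]:
        print(text, "->", classifier(text)[0])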

Content from a leader figure connecting with far-right extremist audiences

Musk’s acquisition of the X platform has led to a unification of, and surge in, far-right extremist communities, because his content on the site resonates with this demographic. Musk has an established history of aligning himself with far-right groups, their causes and their values, and a documented past of using language, imagery and concepts that resonate with far-right communities. One such incident was his controversial decision to visually compare Canadian leader Justin Trudeau to Adolf Hitler in a now-deleted post on his platform, which the Auschwitz Museum condemned as exploitative and “disrespectful” (Neate, 2022). The incident becomes more significant in context: Musk made the comparison in direct response to a report that Canadian authorities were restricting cryptocurrency donations to anti-vaccination causes (Neate, 2022), at a time when Canada was experiencing major economic disruption because of anti-vaccination protests by truckers (Neate, 2022). Prior to the comparison, Musk had already signalled support for these protestors, slowly securing his position as an ally on an issue that right-wing communities find significant (Neate, 2022). Other notable instances include his frequent use of the term ‘woke mind virus’ to refer to individuals who think or live progressively, particularly in defence of minority groups (Sagastume Muralles, 2024). Once again weaponising ‘woke’, a popular term of dismissal or demonisation within right-wing spaces, the phrase implies that anybody who disagrees with his worldview is sick, infected or otherwise unfit to assert views. While these passive efforts are the foundation for speculation about his potential far-right beliefs, Musk’s more direct activities since acquiring the platform strengthen the argument that he is interested in aligning with far-right communities. Since the acquisition, Musk has made several posts containing well-known dog whistles for white supremacy and Nazism. In particular, the dog whistles ‘88’ and ‘14’ have appeared in his posts, the former standing for ‘Heil Hitler’ and the latter referencing the ‘Fourteen Words’, a white-supremacist slogan (Gilbert, 2022; Weimann & Am, 2020). It can be argued that these were arbitrary figures thrown into mundane posts, but intention does not nullify the dialogues these figures have generated on the platform. Regardless of intent, far-right ideological extremists across the web have found resonance in his posts, celebrating his usage of these terms openly (Gilbert, 2022). With many of these celebrations remaining visible on the platform to date, the lack of action, consideration or moderation on these matters aligns directly with Musk’s promises of ‘free speech’ regardless of political belief. By creating this content and subsequently allowing extremist commentary and celebration around it, Musk has undoubtedly contributed to a larger-scale problem of ideologically extreme behaviours and communities across the platform.
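Detecting numeric dog whistles is difficult precisely because the tokens are innocuous in most contexts, which is part of why they circulate so freely. A hypothetical sketch (the cue-word list below is invented for illustration, not drawn from any real moderation system) shows why a bare match on ‘88’ or ‘14’ tells a moderator almost nothing without co-occurring context:

    import re

    # Hypothetical illustration of numeric dog-whistle ambiguity
    # (cf. Weimann & Am, 2020); both sets are invented for this example.
    DOG_WHISTLES = {"88", "14"}
    CONTEXT_CUES = {"heil", "words", "blood", "soil"}

    def assess(text: str) -> dict:
        tokens = set(re.findall(r"[a-z0-9]+", text.lower()))
        hits = DOG_WHISTLES & tokens
        # A bare number is ambiguous; only co-occurring cues raise suspicion.
        return {"hits": hits, "suspicious": bool(hits and CONTEXT_CUES & tokens)}

    print(assess("Grandad turns 88 today!"))                # hit, not suspicious
    print(assess("88 precepts, 14 words, blood and soil"))  # suspicious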

Conclusion

Overall, the acquisition of Twitter – now X – by Elon Musk marks a significant turning point in the presence of far-right extremism on the platform. When Musk purchased the platform in 2022, his rapid mass layoffs, structural changes and policy revisions within the company suggested that the platform was about to change quickly. This proved true when Musk dismissed a large fraction of platform staff and forcibly dissolved existing safety resources and measures such as the Twitter Trust and Safety Council. These decisions have resulted in a greater presence of hate speech and hateful content, contributing to a normalisation of such behaviour under the guise of ‘free speech’. While these behaviours are not unfamiliar to internet spaces, the invitation, acceptance and entertainment of these communities on such a large scale, and in so short a time, secures the case of Elon Musk and X as significant. Additionally, the use and allowance of content appealing to far-right extremists contributes to the growing presence of far-right extremism on the platform. The use of digital dog whistles, intentional or not, has led to a high concentration of discussion and celebration around Nazism, fascism and other far-right ideological extremes. The case of Elon Musk’s governance of X offers significant opportunities to explore the influence platform owners hold, and how readily the masses respond to that influence. In particular, it reveals a certain fragility in online communities, one that revolves around the responsibility – or lack thereof – of those governing the spaces in which communities gather. By ‘unmoderating’ a platform to the benefit of ideological extremists, Musk has singlehandedly created a safe space for unsafe behaviours and communities to take shelter, which could have devastating consequences should this extremism leave the ideological plane.

References

Benton, B., Choi, J-A., Luo, Y., & Green, K. (2022). Hate speech spikes on Twitter after Elon Musk acquires the platform. School of Communication and Media, Montclair State University, 33. https://digitalcommons.montclair.edu/scom-facpubs/33

Browne, R. (2023, February 2). Twitter will start charging developers for API access as Elon Musk seeks to drive revenue. CNBC. https://www.cnbc.com/2023/02/02/twitter-to-start-charging-developers-for-api-access.html

Cassam, Q. (2021). Extremism: A philosophical analysis. Routledge. https://doi.org/10.4324/9780429325472

Ghazzawi, I. (2024). At the helm of Twitter: The leadership style of Elon Musk. Journal of Case Research and Inquiry, 9(1), 78-107. https://www.researchgate.net/publication/379664239_AT_THE_HELM_OF_TWITTER_THE_LEADERSHIP_STYLE_OF_ELON_MUSK

Gilbert, D. (2022, November 29). Elon Musk is turning Twitter into a haven for Nazis. Vice. https://www.vice.com/en/article/elon-musk-twitter-nazis-white-supremacy/

Gotfredsen, S. G. (2023, December 6). Q&A: What happened to academic research on Twitter? Columbia Journalism Review. https://www.cjr.org/tow_center/qa-what-happened-to-academic-research-on-twitter.php

Keane, R. (2024). Elon Musk’s purchase of Twitter and its effect on hate speech impressions. The National High School Journal of Science, Advance online publication. https://nhsjs.com/2024/elon-musks-purchase-of-twitter-and-its-effect-on-hate-speech-impressions/

Molden, D. (2023). A tale told by an idiot: Elon Musk and free speech on Twitter. Aichi Shukutoku University Bulletin, 7(1), 15-24. https://aska-r.repo.nii.ac.jp/records/8672

Neate, R. (2022, February 18). Elon Musk criticised for likening Justin Trudeau to Adolf Hitler in tweet. The Guardian. https://www.theguardian.com/technology/2022/feb/17/elon-musk-criticised-for-comparing-justin-trudeau-to-adolf-hitler-tweet-auschwitz

O’Brien, M., & Ortutay, B. (2022, December 13). Musk’s Twitter dissolves Trust and Safety Council. The National News Desk. https://thenationaldesk.com/news/americas-news-now/musks-twitter-dissolves-trust-and-safety-council-elon-twitter-files-social-media-platform-independent-civil-human-rights-hate-speech-child-exploitation-suicide-self-harm?photo=3

Poon, C. V. (2024). A social network for misinformation and hate: Twitter After Elon Musk. Capstone. https://capstone.capilanou.ca/2024/02/04/social-network-for-misinformation-and-hate-twitter-after-elon-musk/

Sagastume Muralles, J. (2024). Visions and values of algorithms: A study of how Elon Musk envisions X’s algorithms [Master’s thesis, Lund University]. Lund University Libraries. https://lup.lub.lu.se/student-papers/search/publication/9151492

Vogel, J. (2023). Right To Woke [Master’s thesis, Erasmus University Rotterdam]. Erasmus University Rotterdam. https://thesis.eur.nl/pub/75520/eobs_108612.pdf

Weimann, G., & Am, A. B. (2020). Digital dog whistles: The new online language of extremism. International Journal of Security Studies, 2(1), 4. https://www.academia.edu/download/83354808/viewcontent.pdf

X. (n.d.). X Developers. X. https://developer.x.com/en/portal/petition/essential/basic-info


Comments

8 responses to “X, Home of The Far Right: how a change in platform leadership transformed its political culture”

  1. Suva Pokharel

    It’s fascinating how quickly a platform’s overall usage and content can change after an ownership transfer! As someone who recently researched Reddit’s role in discourse and community creation, I found several parallels in your paper, specifically surrounding the potential spread of harmful misinformation. I think delving into how a user is able to remain anonymous on X, whilst not to the same extent as on Reddit since X requires a mobile number to sign up, would have added another layer to the narrative.

    Whilst I do agree that there has been a shift in the content on X, do you think the arrival of community notes, allowing users to correct potentially harmful information, can combat this at all?

    1. nickjacksonct

      Hello Suva,

      Thank you for your feedback on my paper. I actually -also- recently did a research paper on the anonymity of Reddit and how it functions as a landscape, so I completely see what you’re saying. There are a lot of parallels! I really appreciate and value that feedback, and I feel as if you’re completely right. Specifically, talking about online disinhibition and its effects on communities would have been an interesting route for me to go down.

      I believe that the community notes feature has its pros and cons. It is frequently used to clarify factual points in content, discussions and posts on the platform. However, because it is partially user-reliant, it has the potential to cause issues within echo chamber communities. While community notes are overseen by a (now much smaller) staff body, users are able to give feedback on whether a note is helpful or not, which often influences its visibility. With that in place, community notes that make valid corrections and criticisms may be subject to bias. Additionally, community notes often don’t cover content that is subjective but still extreme in nature. For example, somebody expressing a personal dislike of a minority group could not be corrected through community notes, which leaves plenty of room for ideological extremism to grow and exist in these spaces. So while it is a tool that has the potential to help, with its current functionalities and restrictions, it can only do so much.

      1. Suva Pokharel

        Hey Nick,

        That’s a great point! I didn’t consider that subjective comments can still spread potentially harmful ideologies and ideas, and as they are, like you mentioned, subjective or “opinions”, they may not warrant the use of community notes where misinformation in factual statements would. You’ve clearly thought about the pros and cons of this system, and it reflects your amazing research!

        Great work!

  2. jessicawarburton

    This was a really great rundown of the effects of the events that took place, and I really enjoyed the part where you broke down the methodologies and struggles in studying the platform. It is a good reminder that these “communities” are just commodities for those wealthy enough to buy them.

    1. nickjacksonct

      Hi, thank you so much for this feedback!

      That is very much the case, unfortunately. It’s interesting to think about how communities can be commodified in ways completely unique to the online world as opposed to offline communities, which are less confined to a set ‘space’ and thus less susceptible to being transformed by wealth alone.

  3. 20515539

    Really interesting paper, nickjacksonct. I have to interact way more than I’d like with X, and it’s been fascinating witnessing this in real time. My paper was on toxic fan communities, and whilst that seems unrelated, the overlap with the far-right movement is terribly high. I also still feel like there’s a bit of cognitive dissonance in how the media still cross-link to X as if it hasn’t had this huge change. Interesting to see how it progresses.

  4. Eva

    Hi Nick,

    That was really well-written. I think X is such a perfect example of the dangers of an under-moderated space and how it can devolve into hate speech so quickly, consequently changing the entire demographic of a platform. I had assumed that X is losing relevance due to this shift, but it is troubling that for the users who remained this could lead to a massive echo chamber where hate speech is normalized and extremist discussions are safeguarded and given a platform to advertise from.

    My paper explores the alt-right radicalization pipeline on YouTube, and during my research I found YouTube to be a common place for ‘red-pilling’, but it is generally after this that the alt-right seek out less moderated spaces (typically 4chan) to engage with their community and solidify their ideologies. It would be interesting to see how X now fits within that radicalization landscape. I was wondering: while you were researching, did you notice any specific tipping point where public trust in the platform visibly collapsed?

    If you’re interested and have got the time, I’d love your thoughts on my paper (https://networkconference.netstudies.org/2025/onsc/6010/youtube-as-a-radicalizing-force-the-promotion-of-the-alt-right-pipeline/)

    Thanks again for the great read!

  5. Lyam Temple

    Thanks for such a thought-provoking and well-researched paper. I really appreciated your analysis of how shifts in leadership and moderation on X have reshaped the platform’s overall culture; it was sharp and really insightful.

    One thing I kept thinking about as I read was the question of intent. Do you think the rise in extremist content under Musk’s ownership was a deliberate outcome of his choices, or more of an unintended side effect of pushing for a more “free speech” model with fewer restrictions? Basically, was creating space for harmful ideologies part of the plan, or did those views just become more visible once the old guardrails were taken down?

    Personally, I’d like to think it wasn’t his intent. I genuinely believe Musk has done a lot of good in the world, and I see him as a futurist and a utilitarian, someone acting in what he sees as the best interest of humanity as a whole, even if that doesn’t always align with what benefits the few. The tricky part is, sometimes the voices of the few are louder, and sometimes for good reason. We absolutely need to protect free speech, but not in a way that’s totally unmonitored or without consequences. People should be free to say what they believe, but if that speech intentionally harms others, there should be accountability.

    I’d love to hear your thoughts on how you see Musk’s intent in this situation, and how we weigh responsibility vs intent, where outcomes can be so stark but the motivations might be more complex.

    Thanks again for sharing such a compelling piece, it definitely gave me a lot to reflect on.

    Lyam