YouTube and its role in the creation of a "third place" for right-wing communities

Abstract

This paper argues that right-wing communities are nurtured through the affordances of the YouTube platform. The audiovisual nature of content hosted on YouTube creates strong relationships between viewers and creators, supporting an influencer culture that has been adopted by right-wing creators, or "political influencers". The communities that exist to support political influencers are aided by YouTube's other affordances, including a micro celebrity culture and a recommendation algorithm, which assist like-minded people to come together and experience a sense of belonging. YouTube provides a "third place", as outlined by sociologist Ray Oldenburg, allowing anti-progressive views to be amplified and spread, and potentially to wield negative political influence. Through a review of the available literature, this paper applies the theory of the third place to YouTube's right-wing communities to analyse the ongoing cohesion of these groups in the face of criticism and efforts by the platform to steer viewers away from such content. This paper concludes that right-wing viewers utilise YouTube's features to seek each other out in the safety provided by a third place, where they can find acceptance within engaged communities, and potentially influence others politically through the spread of misinformation. This paper contributes to the wider ongoing debate centred on YouTube and its potential to radicalise viewers, which tends to focus on the role of YouTube's recommendation algorithm, by moving towards a deeper understanding of the social aspects of how radicalisation may occur on YouTube.

 

YouTube is a social network site that makes use of audiovisual communication more than any other platform. Anyone with an Internet connection can upload self-created content, made simply and cheaply with only a phone. With no gatekeepers and little moderation, a creator can say what they wish, provided YouTube's policies against offensive language are adhered to. But as is demonstrated on YouTube every day, there are endless ways of creating inflammatory and belligerent content without using actual slurs. There are also many ways that such content can be spread and made popular using the affordances of YouTube. The establishment of micro celebrity culture on the platform has facilitated the creation of thriving right-wing communities that interact and amplify anti-progressive views. YouTube's algorithm helps users who hold such views to find the content and communities they are seeking, and potentially aids the spread of misinformation. In this paper, it will be argued that YouTube's affordances have helped these communities to thrive. By hosting right-wing extremist content, facilitating a micro celebrity culture, and employing a recommendation algorithm, YouTube creates a "third place" in which existing far-right communities can share and reinforce their views, and potentially exert political influence.

 

YouTube is the most popular social network among right-wing users (Munger & Phillips, 2022, p. 186). It uses audiovisual communication more than any other social network site, a feature that allows content creators to connect on an emotional level with their audiences through an affective style of politics (Lewis, 2018, p. 20; Ekman, 2014, p. 96). People increasingly obtain their news from user-generated content, which is problematic when anyone with an Internet connection can now produce media, making any claims they wish with no fact-checking required (Munger & Phillips, 2022, p. 190). Mistrust of mainstream media amongst conservative and right-wing groups has been building for decades, so it is unsurprising that right-wing content attracts the highest number of viewers of any political content on YouTube (Finlayson, 2022, p. 67; Freelon et al., 2020, p. 1198). Right-wing content can be distinguished by its glorification of free speech and its opposition to "political correctness". These videos often show support for racism, white nationalism, anti-feminism and anti-Semitism (Munger & Phillips, 2022, p. 199). Any content deemed to violate YouTube's policies, including hate speech, graphic violence and malicious attacks, is removed from the site (YouTube, 2022). However, these policies leave much inflammatory content on the platform, because creators can circumvent the rules by avoiding the explicit use of slurs while still putting forward abhorrent views (Lewis, 2018, p. 43; de Keulenaar et al., 2021, p. 18). Despite attracting criticism for allowing this content on its site, and for its unclear policy on what constitutes hate speech, YouTube defends its stance on the basis of support for free speech (Lewis, 2020a; Hokka, 2021; Stokel-Walker, 2019). This stance makes possible the existence of strong right-wing communities.

 

Munger and Phillips (2022, p. 187) state that communities form on YouTube much more easily than on other social network sites, with shared ideas as the main focal point. Members of these communities tend to have strong interests in politics but low news literacy, preferring the medium of video for its visual and emotional appeal. They are often white working-class men who form parasocial relationships with the creators of the content they view (Munger & Phillips, 2022, p. 198). Channels broadcasting right-wing content tend to have popular comments sections, indicating a high level of interaction and a strong community amongst viewers (Munger & Phillips, 2022, p. 208). YouTube allows its users to actively engage and participate with content and with each other through affordances such as likes, dislikes, comments, subscribing, sharing, upvoting and downvoting (Finlayson, 2022, p. 68). Steven Crowder, a popular right-wing YouTube micro celebrity whose channel has 5.62 million subscribers, posts regular videos that attract millions of views, and his viewers interact with each other by posting and replying to comments in the comments section (https://www.youtube.com/c/StevenCrowder). People do not tend to seek out information that contradicts their views or beliefs, and process political information emotionally. They trust material shared by those they know over that found via mainstream media, thereby facilitating the formation of communities of like-minded individuals (Boczkowski et al., 2018, p. 3534; Papacharissi & Trevey, 2018, p. 89). Some communities form around a shared belief that they are "social underdogs", banding together over their rejection of "progressive values" (Lewis, 2018, p. 16). The Internet has assisted the formation of communities that would likely not exist offline, bringing together people from all over the world who would otherwise never meet or converse, creating "thin communities" made up of strangers with similar tastes (Delanty, 2018, pp. 204-215). Communities such as those that have formed around micro celebrities Steven Crowder and Marcus Follin (https://www.youtube.com/c/TheLatsbrah) are able to exist in the "third place" that YouTube has provided.

 

Oldenburg (1999) devised the term "third place" to name the public spaces people used for informal and casual social interaction, such as cafes, parks, and libraries. This concept has been extended to online interactions, despite Oldenburg's belief that third places needed to be sites of face-to-face interaction. Other scholars have argued that the accessible and non-hierarchical nature of online interactions can provide an alternative to traditional third places, allowing communication between equals and community building (Argen, 1998, as cited in Soukup, 2006, p. 428). Given the micro celebrity culture that exists among right-wing users of YouTube, it is difficult to argue that these online spaces allow interaction between equals. However, the sense of belonging created in these spaces does constitute a third place, as stated by Soukup (2006, p. 423), and it is in these spaces that commitment to right-wing politics is fostered, with content creators performing as "political influencers" and making use of YouTube's micro celebrity culture to nurture communities.

 

Micro celebrity in the online sphere refers to social media users with niche audiences who have adopted techniques from mainstream celebrity culture to gain a following (Lewis, 2020b, p. 203). Political influencers utilise the participatory affordances of the platform to engage their audiences and enhance their micro celebrity in a largely unmoderated environment (Lewis, 2018, p. 1; de Keulenaar et al., 2021, p. 4). Jenkins et al. (2009, p. vi) argued that participatory culture involves some level of mentorship and social connection, and portrayed the nature of these interactions as positive; however, the same aspects of participatory culture can empower creators with undemocratic and anti-progressive agendas (Ekman, 2014, p. 80). The right-wing micro celebrities of YouTube form part of an "alternative influence network", as defined by Lewis (2018), and oppose mainstream media and social justice movements (Lewis, 2020b, p. 204). The videos or "vlogs" they produce often espouse scientific ideas that have long been discredited to justify racism and xenophobia, with the creator telling personal stories as though to a friend to foster intimacy, thereby adopting micro celebrity techniques to demonstrate their relatability, authenticity, and accountability (Lewis, 2020b, p. 201; Lewis, 2020a). Videos are made solo or in collaboration with other well-known right-wing micro celebrities, including guest appearances and debates (de Keulenaar et al., 2021, p. 5). In this way, right-wing messaging is amplified and disseminated amongst various audiences (Lewis, 2020b). Marketing strategies are employed to increase the spreadability of the content, including search engine optimisation, ideological testimonies, and self-branding (Lewis, 2018, pp. 25-30; de Keulenaar et al., 2021, p. 5). Content is monetised via the insertion of advertisements, just as other influencers monetise theirs, but rather than making an income from selling products to their audience, political influencers are selling political ideas (Lewis, 2018, p. 25). They are incentivised by the platform to provide their audience with what it demands; thus the continued existence of right-wing channels is in some part due to YouTube's structure of community engagement and the algorithms it employs to bring like-minded people together (Lewis, 2020a). YouTube's support of micro celebrity culture, through the platform's participatory features and incentivising of creators, allows right-wing communities to thrive.

 

The Internet has made it much easier for right-wing groups to find new audiences. Before the Internet, such groups had few ways to communicate their extreme views to a receptive audience, as these views pushed the boundaries of acceptable behaviour (de Keulenaar et al., 2021, p. 4). YouTube provides them with a platform from which to communicate with viewers and create a community through its provision of a third place (Ekman, 2014, pp. 80-82). Right-wing groups tend to be marginalised and disjointed, so the ability to find like-minded people online provides a sense of social cohesion and belonging (Ekman, 2014, p. 82). YouTube's recommendation algorithm aids this discovery, and as such is a source of contention amongst scholars. Many believe that it steers users who tend to watch moderate right-wing content towards more extreme far-right channels to "maximise viewer engagement", constituting a "radicalisation by algorithm" (Freelon et al., 2020, p. 1199; Munger & Phillips, 2022, p. 187). Others have argued that the algorithm does not promote right-wing content, but rather recommends mainstream media before user-created YouTube channels (Ledwich & Zaitsev, 2020). Munger and Phillips (2022) and Ledwich and Zaitsev (2020) have stated that right-wing content on YouTube is subject to supply and demand: rather than the algorithm steering viewers towards right-wing content and radicalising them, users arrive at YouTube with pre-existing right-wing views and the algorithm simply helps them to find such content easily. In response to criticism that it was promoting right-wing content, YouTube altered its algorithm to demote such content rather than promote it (YouTube, 2019). Munger and Phillips (2022, p. 188) claim that viewership of extreme right content has been in decline since 2017, but this has been disputed by de Keulenaar et al. (2021, p. 15), who argue that while viewer numbers are down, the level of engagement with right-wing content has increased. This would indicate strong and active communities, with the potential to wield political influence.

 

Social media can facilitate the spread of misleading information and amplify social and political grievances, giving marginalised groups a platform or third place (Zhuravskaya et al., 2020, p. 416). Social media has low barriers to entry and relies heavily on user-generated content, which anyone can upload with no fact-checking and little moderation or impartial editing (Zuk & Zuk, 2020, p. 795). Consumers of far-right content tend to be more engaged than average users, forming small yet stable communities, which on a site as large as YouTube still amounts to a large number of individuals (Hosseinmardi et al., 2021, p. 5). Content is tailored to pre-existing preferences via YouTube's algorithms, so that users exist within "echo chambers" (Zhuravskaya et al., 2020, pp. 417-424). While social media can be used to encourage people to vote and engage in positive forms of political activity, it can also increase perceptions of government corruption, so much so that some studies have shown that incumbents lose the vote in areas where there is broadband penetration and therefore access to political information (Zhuravskaya et al., 2020, p. 419). The more viewers a piece of content has, the greater its potential to exert political influence, and the real danger posed by right-wing content on YouTube is its propensity to create "radical alternative political canons" accompanied by dedicated communities (Munger & Phillips, 2022, pp. 6, 187). YouTube provides its users with an enormous archive of content in one place, with discoverability aided by algorithms that help viewers find what they want based on their preferences. This content has the potential to amplify information that is inaccurate and harmful, as questionable content spreads in the same way as reputable or fact-checked content, through being shared on social media platforms (Cinelli et al., 2020). With algorithms that keep users ensconced in their chosen echo chambers, and very little moderation in place to prevent the spread of misinformation, YouTube assists in furthering the insularity and marginalisation of right-wing communities.

 

The audiovisual affordances of YouTube make it a unique space for fostering strong relationships between creators and viewers. The micro celebrity culture that has emerged on the platform, and its lack of effective censorship of content expressing anti-progressive and right-wing ideas, have resulted in highly engaged communities that thrive in the third place that YouTube provides. Political influencers perform authenticity to gain their supporters' trust and amplify their right-wing views by growing their followings and fostering communities. While the radicalisation potential of its recommendation algorithm has been debated, YouTube's discoverability features serve to match users who wish to view right-wing content with what they want to see. Social media features such as comments, subscriptions and likes encourage interaction and participation, building communities, aiding the spread of misleading information, and thereby potentially exerting negative political influence.

 

 

References

 

Boczkowski, P.J., Mitchelstein, E., & Matassi, M. (2018). “News comes across when I’m in a moment of leisure”: Understanding the practices of incidental news consumption on social media. New Media & Society, 20(10), 3523-3539. https://doi.org/10.1177/1461444817750396

 

Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., Zola, P., Zollo, F., & Scala, A. (2020). The COVID-19 social media infodemic. Scientific Reports, 10(1), 16598. https://doi.org/10.1038/s41598-020-73510-5

 

de Keulenaar, E., Tuters, M., Osborne-Carey, C., Jurg, D., & Kisjes, I. (2021). A free market in extreme speech: Scientific racism and bloodsports on YouTube. Digital Scholarship in the Humanities. https://doi.org/10.1093/llc/fqab076

 

Delanty, G. (2018). Community (3rd ed.). Routledge. https://doi.org/10.4324/9781315158259

 

Ekman, M. (2014). The dark side of online activism: Swedish right-wing extremist video activism on YouTube. MedieKultur: Journal of Media and Communication Research, 30(56), 79-99. https://doi.org/10.7146/mediekultur.v30i56.8967

 

Finlayson, A. (2022). YouTube and political ideologies: Technology, populism and rhetorical form. Political Studies, 70(1), 62-80. https://doi.org/10.1177/0032321720934630

 

Freelon, D., Marwick, A., & Kreiss, D. (2020). False equivalencies: Online activism from left to right. Science, 369(6508), 1197-1201. https://doi.org/10.1126/science.abb2428

 

Hokka, J. (2021). PewDiePie, racism and YouTube’s neoliberalist interpretation of free speech. Convergence: The International Journal of Research into New Media Technologies, 27(1), 142-160. https://doi.org/10.1177/1354856520938602

 

Hosseinmardi, H., Ghasemian, A., Clauset, A., Mobius, M., Rothschild, D.M., & Watts, D.J. (2021). Examining the consumption of radical content on YouTube. Proceedings of the National Academy of Sciences, 118(32), 1-10. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8364190/pdf/pnas.202101967.pdf

 

Jenkins, H., Purushotma, R., Clinton, K., Weigel, M., & Robinson, A. (2009). Confronting the challenges of participatory culture: Media education for the 21st century. MIT Press. https://library.oapen.org/handle/20.500.12657/26083

 

Ledwich, M., & Zaitsev, A. (2020). Algorithmic extremism: Examining YouTube’s rabbit hole of radicalization. First Monday, 25(3). https://firstmonday.org/ojs/index.php/fm/article/download/10419/9404

 

Lewis, R. (2018). Alternative influence: Broadcasting the reactionary right on YouTube. Data & Society, 1-60. https://datasociety.net/library/alternative-influence/

 

Lewis, R. (2020a, December 11). I warned in 2018 YouTube was fueling far-right extremism. Here’s what the platform should be doing. The Guardian. https://www.theguardian.com/technology/2020/dec/11/youtube-islamophobia-christchurch-shooter-hate-speech

 

Lewis, R. (2020b). “This is what the news won’t show you”: YouTube creators and the reactionary politics of micro-celebrity. Television & New Media, 21(2), 201-217. https://doi.org/10.1177/1527476419879919

 

Munger, K., & Phillips, J. (2022). Right-wing YouTube: A supply and demand perspective. The International Journal of Press/Politics, 27(1), 186-219. https://doi.org/10.1177/1940161220964767

 

Oldenburg, R. (1999). The great good place: Cafes, coffee shops, bookstores, bars, hair salons and other hangouts at the heart of a community (2nd ed.). Marlowe & Company.

 

Papacharissi, Z., & Trevey, M. T. (2018). Affective publics and windows of opportunity: Social media and the potential for social change. In M. Graham (Ed.), The Routledge Companion to Media and Activism (1st ed., pp. 87-96). Routledge. https://doi.org/10.4324/9781315475059

 

Soukup, C. (2006). Computer-mediated communication as a virtual third place: Building Oldenburg’s great good places on the world wide web. New Media & Society, 8(3), 421-440. https://doi.org/10.1177/1461444806061953

 

Stokel-Walker, C. (2019, June 6). YouTube’s plan to fix hate speech failed before it even started. Wired. https://www.wired.co.uk/article/youtube-steven-crowder-ban-hate-speech

YouTube. (2019). Our ongoing work to tackle hate. https://blog.youtube/news-and-events/our-ongoing-work-to-tackle-hate/

 

YouTube. (2022). YouTube Help: Policy, safety and copyright. https://support.google.com/youtube/answer/9288567

 

Zhuravskaya, E., Petrova, M., & Enikolopov, R. (2020). Political effects of the Internet and social media. Annual Review of Economics, 12(1), 415-438. https://doi.org/10.1146/annurev-economics-081919-050239

 

Zuk, P., & Zuk, P. (2020). Right-wing populism in Poland and anti-vaccine myths on YouTube: Political and cultural threats to public health. Global Public Health, 15(6), 790-804. https://doi.org/10.1080/17441692.2020.1718733

Comments

  1. Hi Diana,

    Thank you for the insights into how right-wing communities are nurtured through the affordances of social media platforms such as YouTube. I really like how you throw platform policies and algorithms into the mix. I really like the line where you discussed “radicalisation by algorithm” (Freelon et al., 2020).

    I think the algorithm can be misused as an excuse by platforms to deflect blame for their lack of consideration of consequences and human rights by design. This is particularly difficult when it comes to jokes, anonymity and pseudonymity.

    The subject matter of this essay is very interesting to me. It reminds me of the politics unit I did late last year. We looked at how racism can be amplified online, how old jokes and ideologies can infest social media by taking on different forms, and how maleness and whiteness can be reclaimed as like-minded users find each other and agree with each other. Gamergate is an example that comes to mind, which became a playbook for online harassment. I looked at how Leslie Jones was hit by racism and sexism at the same time, and Twitter basically did nothing until it was too late.

    I find your findings around mistrust of mainstream media by conservative and right-wing groups very interesting. How do you think the future will look as we see more conservative media surface on the internet? You mentioned Steven Crowder (who has sort of branded himself as a “comedian”); there’s also Alex Jones (although he recently filed for bankruptcy, the idea won’t go away anytime soon) and Joe Rogan (another “comedian”).

    Thank you for picking such a topical subject.

    Cheers
    Mags

    Not to do the hard sell, but here is my paper. https://networkconference.netstudies.org/2022/csm/294/indigenous-memes-by-indigenous-hands-how-internet-memes-become-an-important-storytelling-medium-used-by-indigenous-peoples/

    • Diana Baric says:

      Hi Mags

      Thanks for taking the time to read my paper and share your thoughts.

      I agree that algorithms get too much attention, from the platforms themselves and even from some academics writing about radicalisation. The idea that people are mindless automatons who get led down the right-wing rabbit hole like children is a bit ridiculous, and really not the point. As Rebecca Lewis pointed out, it’s not that people get led to this content, it’s that it’s there in the first place, up on YouTube and other platforms, uncensored and un-factchecked. Platforms have a greater responsibility to their audience than to just allow material that incites hate to sit there and be viewed millions of times.

      Regarding your question about the future of right-wing material: from what I read while researching this paper, it seems that the number of people viewing this material is shrinking, but that the level of interaction between existing viewers is on the rise. It seems the highest number of viewers watching right-wing material on YouTube occurred around 2017, coinciding with the Trump era. I think committed right-wing communities will always be around, but how loud and noisy they are is related to what is happening in the world in general. Trump’s own rhetoric made a lot of these right-wing views (anti-immigration, sexism, racism) suddenly OK to express, whereas they were usually banished to the extreme fringes of society. Now that he’s gone (and hopefully will never return), the number of people “committed to the cause” of extreme right-wing views has possibly dwindled without his rallying of them. But will another similar global figure emerge and grow the numbers again? There’s nothing to prevent it.

      I will certainly check out your paper, thanks for the link 🙂

      • Hi Diana, thanks for your reply.

        I forgot to mention... speaking of YouTube, platforms and propaganda, I read this article back when doing the politics unit. It talks about a publishing company that creates these five-minute “how to” videos, which are very popular on YouTube. The article is eye-opening, because beneath all the “fun” videos there is an agenda. https://www.lawfareblog.com/biggest-social-media-operation-youve-never-heard-run-out-cyprus-russians

        A bit of a side track, but I just wanted to share the link.

        Cheers
        Mags

      • Genevieve Dobson says:

        Great read and insights Diana! Thank you. Gosh, I hope we don’t see another Trump era, although I have heard that since being banned from Twitter he has created his own social media platform, ironically called “Truth Social”. I’m wondering what your and Mags’ thoughts are on the debate around individuals or groups getting banned from one platform and consequently moving to lesser known and lesser governed platforms, where their views can be fostered and supported even further. Are we better to allow “free speech” on the more visible platforms so there is an opportunity to dispute the misinformation or radicalism more widely?

        • Diana Baric says:

          Hi Gen

          Thanks for reading! Yes, I had read somewhere that Trump was looking to make his own site, but like Goldilocks he wanted one that was ‘just right’. ‘Truth Social’ – you just have to laugh, don’t you?!

          It is a concern that right-wing influencers booted off mainstream sites will turn to these less popular but even less regulated platforms. From my research, academics think that their hardcore followers will follow them, but perhaps others won’t because they may not know as many people on the lesser known platform, and would prefer to stay on YouTube. There’s no doubt that the echo chamber will get more insulated on smaller platforms, but hopefully these can be regulated in the same ways as major platforms. It’s a Catch-22: if they get more popular, they attract more attention, and may find they don’t pass under the radar quite so easily anymore. Here’s hoping.

          • Genevieve Dobson says:

            Definitely, here’s hoping! Maybe it’s up to the rest of us to ensure those that are not completely radicalised also have other networks and communities on YouTube and other platforms that make them too good to leave. Safe, friendly and supportive online communities to hopefully drown out the radical and extreme.

            The challenge will always remain that a minority group, with strong and diverse opinions, will often seem to have the loudest voice. We saw that with Covid-19 and anti-lockdown and anti-vax protests. However, in the meantime the rest of us, the majority, were quietly getting on with things, protecting the community, getting vaccinated, still just trying to live our best lives.

  2. Brooke Birch says:

    Hi Diana,

    This was such an interesting read! While writing my paper, I didn’t fully consider how the affordances of social media platforms can often do more harm than good. I think our papers contrast really well in this way, but also build on the same principles. I found your reference to Ekman (2014) particularly fascinating, where you explain how participatory culture inadvertently has the ability to “empower creators with undemocratic and anti-progressive agendas”. I had not fully considered this before, so found it very enlightening! I completely agree that YouTube has a responsibility towards its users regarding the hosting of misinformation on its platform. As you say, while it may not actively promote these videos anymore, they are very easily found and shared. I’d love to know if you came across any solutions to this problem during your research? It seems hard to conceptualise one with freedom of speech being such an important aspect of YouTube’s platform.

    Thanks 😊
    Brooke

  3. Diana Baric says:

    Thanks for reading my paper Brooke, and for your insightful comments. Regarding your question about possible solutions, a great article by Stokel-Walker (https://www.wired.co.uk/article/youtube-steven-crowder-ban-hate-speech) outlines the heart of the problem, which is that YouTube doesn’t want to take a firm stance on determining what constitutes hate speech and what doesn’t, mainly because it would upset a lot of its viewers. As stated in the article: ‘the EU has been pushing them that if something is hateful or racist they have to act, or they get fines. Article 17 [of the new EU Copyright regulation] is also forcing it into no longer pretending to be a blind platform.’ So YouTube has taken some action, taking down a lot of videos and suspending a lot of accounts, but critics argue there is still a lot more it can do. I can’t see it doing so without external pressure, such as that exerted by the EU or any other regulatory body that has jurisdiction over YouTube. As you said, freedom of speech is important and it’s everyone’s right, but there is a lot of misinformation in the opinions these influencers are expressing, which needs to be appropriately responded to. The platform makes money from this type of content, so it needs to be held to account.

  4. Harry Wallace says:

    Hello Diana,
    Very interesting paper, I enjoyed reading it. I was aware that right-wing communities were present on YouTube, but not to this extent. A few passages resonated with me, which I agree with; people do tend to buy into their beliefs even if it defies logic:
    “People do not tend to seek out information that contradicts their views or beliefs, and process political information emotionally”
    “people increasingly obtain their news from user-generated content, which is problematic when anyone with an Internet connection can now produce media, making any claims they wish with no fact-checking required”

    My understanding of the current situation of YouTube’s issues with these communities was that it had been rectified or subdued, but from reading your paper and doing a bit of research, that is not correct at all. An article by Roose (2019) cited:
    “A European research group, VOX-Pol, conducted a separate analysis of nearly 30,000 Twitter accounts affiliated with the alt-right. It found that the accounts linked to YouTube more often than to any other site.” – Roose, K. (2019, June 8). The making of a YouTube radical. The New York Times.

    This gives evidence to your points. I had also found from some sources that a lot of these right-wing communities have migrated to Bittube, which is even more of a haven for them; have you heard of this? I also wanted to ask about your point that content like this mainly avoids removal by avoiding explicit slurs: if videos are manually reported by other users, does YouTube take any action on them?

    • Diana Baric says:

      Hi Harry

      Thanks for taking the time to read my paper, and I’m really glad you got something out of it.

      In answer to your question, yes I have heard of Bittube, and there is also Rumble, discussed here in this article from the Seattle Times: https://www.seattletimes.com/business/rumble-a-youtube-rival-popular-with-conservatives-will-pay-creators-who-challenge-the-status-quo/
      It is of course extremely likely that if kicked off YouTube, these creators will migrate to these lesser known sites with even fewer checks and balances than YouTube has, and continue to spread their misinformation. What can be done about this? The good news, as spelled out in this article on The Conversation (https://theconversation.com/meet-rumble-canadas-new-free-speech-platform-and-its-impact-on-the-fight-against-online-misinformation-163343), is that Rumble is a Canadian platform, so it is subject to publishing laws surrounding harmful content. The Canadian Government can therefore do something about it, as long as there is sufficient political will. I think all of these platforms are not as free to act as they’d like to think they are; they are still subject to law, so it will take political agitation to hopefully make some difference.

      As for reporting content: yes, viewers can report videos, as this YouTube help page describes: https://support.google.com/youtube/answer/2802027?hl=en&co=GENIE.Platform%3DDesktop. The content isn’t automatically taken down; rather, it will be viewed by YouTube moderators, and if it breaches YouTube policies it will be taken down.

      • Harry Wallace says:

        Thanks Diana, that was insightful and answered all my questions! I wonder if we will ever get to a place where this kind of content is kept from being easily consumed by the public, forced onto perhaps only the dark web. I assume one of the next dilemmas would be free speech and the freedom to exercise it, but that is a difficult topic. Thanks again for answering my questions!

        • Diana Baric says:

          You’re welcome Harry, thanks again for asking them!
          Yes, what you say about free speech is an interesting point. While free speech is everyone’s right, the flipside is that there are consequences to free speech, i.e. people will argue with you, be offended, or you might get kicked off YouTube for saying things that breach its codes. You won’t get locked up, but what you say, if it is misleading or incorrect, will get you in trouble. Will these kinds of communities go completely underground, to the dark web, as you suggested? Maybe, but sadly for the time being I think these sorts of groups are so emboldened by the likes of Trump and Fox News that we won’t be seeing them disappear from mainstream channels like YouTube any time soon.

          • Harry Wallace says:

            Yes, I agree. I just viewed some content where interviewers went to rallies with Trump supporters and other right-wing ideologues, and it was scary how delusional some people were. These communities are so strong and back one another that they could seemingly be convinced of any conspiracy theory and preach it to the world. I also do not believe this will be going away anytime soon, unfortunately.

            Thanks again for your replies Diana, great paper.

  5. Saara Ismail says:

    Hi Diana, this was such an interesting read. My paper focuses on the many benefits of influencers and influencer culture on social media, although I did not look into any negative effects of social media. I like how specific this paper is, focusing on the single platform of YouTube, which makes it so much more interesting! I agree with and like how you engage with the concept of algorithms and how they further generate communities, particularly right-wing communities, through the use of YouTube.

    • Diana Baric says:

      Hi Saara

      Thanks for reading. I’ve really enjoyed the papers I’ve read about influencer culture during the conference, yours included, that have discussed the positive and negative outcomes of this aspect of social media. It really is shades of grey: influencer culture is both good and bad, depending on what it is used for. There is a lot of trust placed in these micro celebrities; so much depends on what motivates them as to whether what they do online will be positive or negative.

  6. Ken Lyons says:

    Hi Dee,

    Your paper is informative and interesting. Well done! I was surprised by the fact YouTube is the most popular social networking platform among right-wing users (Munger & Phillips, 2022, p. 186) – in my own head I thought it would have been Twitter, especially given all the hype around Trump using Twitter as his main platform until he was removed. I agree that extremism has always been present – both left and right-wing – and both can be damaging. Extreme right-wing activism is particularly dangerous and abhorrent though. There should be no place in society for many of the views expressed in those channels. I think it’s simply a fact of life now, with improved technology, that those types of views will be amplified globally. There is a real echo-chamber effect happening – people hear/see the type of material they seek out and then the algorithms feed that further. No matter what we’re looking for, either in earnest or casually, the algorithms always seem to be able to find more and more information that is relevant. Inevitably, we get fed more and more of what we seek (and already believe?) and less of the opposing views.

    I think many of the extremists that exist on social media in general, and YouTube in particular, could be considered charismatic. They are able to grab people’s attention and hold onto it. They are very effective at delivering their messages in ways that make them believable. They themselves often come across as likeable and relatable. They are the ‘everyday man or woman’ just ‘saying what everyone believes’ – they’re playing on the old marketing adage of ‘know, like and trust’ to gain followers, which, of course, further feeds the algorithms. Being able to see people on their YouTube channel makes them more relatable – “they look just like me” so they must be trustworthy! They are able to harass and bully certain sectors of the community, often with impunity, because of the sometimes subtle nature of their attacks. There is a fine line between free speech and attack, which seems, more often than not, to be crossed.

    Perhaps, at least initially, people seek out others with like-minded beliefs, and as they watch the results of their searches, they become more entwined in the echo-chamber that develops. They become a part of the conversation, chatting with others in the comments section. Soon, it seems that the person in the video is speaking directly to them, reinforcing their beliefs. The more they watch and listen, the fewer counter-views they will be shown, and eventually it seems that theirs is the only true position on the topic – it’s the position the majority hold because it’s what ‘everyone is saying’. In reality, it’s a minority, but it seems to be validated because they don’t get to see the counter-position.

    Algorithms are there to help us. But are they there to help YouTube as well? The cynical view could be that the more relevant content YouTube shows the consumer, the more they will watch, which means there will be more ad revenue. That ad revenue is good for the content creator and will encourage them to make even more content, but it’s also good for YouTube.

    So, in the context of community, is YouTube a third place where people gather? Perhaps it is – as you discovered, Oldenburg (1999) coined the term ‘third place’ but also believed that it needed to be a physical place where people could meet to commune. Many other scholars, on the other hand, have evolved that definition to include online virtual third places. So yes, I believe YouTube is in this instance a community. It’s a virtual third place where people can gather and communicate with one another about things that interest or concern them. These communities might be considered ‘thin’ in so much as people come and go with ease, because they’re not tied down geographically, and they might not have the same intimate knowledge about other members of the community, but they still meet over common interests.

    There have always been people that will feed other people’s fears. Sadly, those people have now moved online and are able to reach more people more easily.

    Cheers,
    Ken

    • Diana Baric says:

      Thanks for reading Ken, and thank you for your considered comments, you summarise the main points brilliantly!

      Charisma is definitely a tool used by influencers in general, and certainly works a treat for these political influencers. By seeming ‘reasonable’ and ‘rational’, they appeal to their audiences, who no doubt already share their views or are open to them, and amplify these views.

      I feel that the third place applies to this space largely because these views are often abhorrent to a wider community, so like-minded people seek out those who share their views for camaraderie and a form of companionship, as do all communities who come together online looking to share something. It may be the case that friends and family don’t share these views, or they might be a loner. And as you say, it’s so easy to reach people online, regardless of what you are looking for.

      • Ken Lyons says:

        Hi Dee,

        I think many people try to find solace with others that have the same beliefs as themselves. Perhaps they find that in the ‘real world’ their views aren’t as well tolerated, but when they move online it’s much easier to find people that think the same as they do. This, I think, is where the echo-chamber effect really comes into play. People hear similar views to their own, without counter argument, and they begin to believe that ‘the masses’ all have the same view as them, simply because everyone they are discussing the matter with does. They soon come to believe that those around them with opposing views are in fact the minority – a twisted truth at best.

        We live in exciting and interesting times and I am excited for the future, even though some of it is going to be negative. There is always going to be good and bad in the world, as there always has been. I’m sure the benefits of social media and online virtual third spaces will be positive overall and I look forward to seeing what the future brings.

        Cheers,
        Ken

  7. Benjamin Scott says:

    Hi Diana,

    This was a really well written paper on a very interesting topic. I think that online echo chambers are a huge issue for free thinking, and your paper is definitely an illustration of how this works on YouTube. People don’t seek out content online that they don’t agree with, and in some cases this leads them down the rabbit hole of radicalisation. The real question is how this will be combated going forward. I believe that the YouTube algorithm in particular is very good at achieving what it is supposed to, and I can’t see it changing too much in terms of its recommendation system. It’s also difficult to draw the line at what should and shouldn’t be promoted on social media sites. Misinformation is a huge issue online, and it is going to be quite interesting to see how it affects the real world in the short term and the long run.

    Thanks,
    Ben

    • Diana Baric says:

      Hi Benjamin

      Thanks for your insightful comments. I agree that people will continue to seek information that confirms their biases, regardless of which side of the political fence they sit on. I can only hope that ongoing criticism of the major platforms (and the minor ones too!) might help get the worst of the misinformation and hate speech taken down. Tougher policies by the platforms themselves, and tougher regulations by governments and other political bodies, would help too.

  8. Wilmer Wong Wan Po says:

    Hi Diana,

    Thanks for sharing.

    The reading was very insightful. I would never have imagined that right-wing content would propagate as much on YouTube, though I did run into a few videos by right-wing groups. I think much of the content goes unregulated on YouTube because these groups disseminate in a subtle way information that would otherwise be divisive or incendiary. As you said, ‘The videos or “vlogs” they produce often espouse scientific ideas that have long been discredited to justify racism and xenophobia, with the creator telling personal stories as though to a friend to foster intimacy, thereby adopting micro celebrity techniques to demonstrate their relatability, authenticity, and accountability.’ Right-wing ideologies have therefore been able to blend into YouTube by promulgating less explicit content in their videos to reach audiences and grab their attention.

    Regards,
    Wilmer
