Refracted Publics on Twitter: Anti-Vaxxers’ Role in Increasing Global Vaccine Hesitancy.

Abstract:

The online anti-vax community has utilised the Twitter platform to disseminate false and misleading information regarding the efficacy of vaccines. Acting as a refracted public, this community has engaged in insidious strategies and tactics that have resulted in a significant increase in vaccine hesitancy among the public.




Social media platforms, such as Twitter, have created space for many online communities to flourish, resulting in positive relationships forming between users. However, they have also allowed a specific subset of like-minded users, commonly referred to as anti-vaxxers, to connect with one another and steadily grow their online following. This online community of anti-vaxxers functions as a refracted public on Twitter, gathering and disseminating misinformation about vaccines through clever strategies designed to capture the attention of a specific audience. This online anti-vax activity exploded in the wake of the SARS-CoV-2 (COVID-19) pandemic and, despite well-intentioned attempts by Twitter to curb the spread of vaccine-related misinformation, has driven a global rise in vaccine hesitancy.

 

Online Communities

The rise of information and communication technologies has created a vast number of “technologically-mediated communities” (Delanty, 2018, p. 201) that may no longer have a solid, spatial place but instead are “more fluid and temporary forms of social relations sustained only by processes of communication outside of which they have no reality” (Delanty, 2018, p. 201). Communities are no longer only in-person gatherings of individuals with strong ties to each other; they have been transformed and expanded by communication technology, such as social media platforms (Hampton, 2016). These platforms, Twitter among them, give users the ability to share their thoughts without much consequence, creating opportunities for people from different walks of life, who hold similar thoughts and opinions, to meet in virtual spaces (Benoit & Mauldin, 2021). This can be as negative as it is positive, because it also allows space for extremist communities to band together (Kata, 2012).

 

The Online Anti-Vax Community

The online anti-vax community is an extremist group of like-minded individuals who are opposed to vaccines – they do not believe vaccines work, deem them unsafe, and often distort or reject the science behind them (Benoit & Mauldin, 2021; Khadafi et al., 2022). Delanty (2018) asserts that like-minded individuals who discover each other online tend to create groups that “affirm one’s prejudices” (Parsell, 2008, as cited in Delanty, 2018, p. 219), and previous studies have found that anti-vaxxers are resolute in their beliefs and not open to accepting the possibility of being wrong (Kata, 2012; Mitra et al., 2016). The anti-vax community uses Twitter to amplify its views and opinions (Maci, 2019) and to “disseminate messages, facts and beliefs” (Betsch et al., 2012, as cited in Benoit & Mauldin, 2021, para. 4) that oppose vaccination. Delanty (2018) explains that “the sharing of information” (p. 205) is a key objective of virtual communities, which aligns with the anti-vaxxers’ ultimate aim of sharing and spreading misinformation.

This virtual community of anti-vaxxers has grown exponentially online (Benoit & Mauldin, 2021; Blane et al., 2022; Muric et al., 2021), particularly in response to the COVID-19 pandemic (Armitage, 2021; Blane et al., 2022; Muric et al., 2021; Nasralah et al., 2022). Their online activities have had such an effect on their audience, across multiple social media platforms, that the World Health Organization has listed vaccine hesitancy as one of the top ten threats to global health (Armitage, 2021; Astrup, 2019; Khadafi et al., 2022; Nasralah et al., 2022).

 

So, What is Vaccine Hesitancy?

Vaccine hesitancy is a reluctance to follow the recommendation to vaccinate, either oneself or one’s children (Astrup, 2019). It has a negative effect on society as a whole, as it endangers the protection that herd immunity brings (Alvarez-Zuzek et al., 2022). Muric et al. (2021) cite a multitude of studies showing that vaccine hesitant individuals tend to be much more easily influenced by misinformation than pro-vaccine individuals, and they are more readily swayed by emotional appeals (Nguyen & Catalan-Matamoros, 2022). The prevalence of anti-vax misinformation online has lowered vaccine trust (Bradshaw, 2022), and apprehensions regarding vaccination can also spread to others in an individual’s social circle (Alvarez-Zuzek et al., 2022; Astrup, 2019).

 

Refracted Publics

In order to spread their opinions and beliefs on Twitter, anti-vaxxers have devolved into operating as a refracted public. Refracted publics “are shaped by circumvention” (Abidin, 2021, para. 8); they allow communities to post and spread content that can be extremely harmful to other users (Abidin, 2021). They engage in a delicate balancing of their own online visibility in order to “redirect audience interest” (Abidin, 2021, para. 1) to where it is most needed. This dissemination of anti-vaxxers’ views and beliefs on Twitter has caused an increased level of “information distrust” (Abidin, 2021, para. 9), leading to rising scepticism about the safety of vaccines. This distrust was exacerbated in part by an infodemic triggered by COVID-19. Infodemics on social media occur when there is “perpetual content saturation” (Abidin, 2021, para. 9); platforms are flooded with a constant stream of information, including false information, to the point that consumers become overwhelmed and unable to absorb it all (Abidin, 2021; Skafle et al., 2022). This continuous stream of information, and the subsequent scepticism of it, are further examples of the way the online anti-vax community operates as a refracted public (Abidin, 2021) on Twitter. By taking advantage of this phenomenon, anti-vaxxers have heightened vaccine hesitancy by amplifying their anti-vaccination messages through insidious strategies such as clickbait tactics, hashtag jacking and social steganography (Abidin, 2021).

 

Clickbait Tactics

Anti-vaxxers use conspiracy theories to create a negative narrative surrounding vaccines (Ginossar et al., 2022; Nasralah et al., 2022), particularly by enticing their audience into clicking links that lead to harmful vaccine misinformation. This strategy is known as clickbait; anti-vaxxers use conspiracy theories as a controversial subject that grabs attention and increases the interest of their audience (Abidin, 2021), particularly among the vaccine hesitant. One such conspiracy theory that gained traction on Twitter during the COVID-19 pandemic is that the vaccine was created to harm the general public instead of protecting it (Kata, 2012; McCarthy et al., 2022). Kata (2012) describes this tactic as a “toxin gambit” (p. 3783) designed to scare individuals about particular ingredients in vaccines. Anti-vaxxers shared this misinformation on Twitter through links to websites containing harmful misinformation about vaccine ingredients (Low, 2021; Malisch, 2022). Since vaccine hesitant individuals tend to have lower trust in vaccines to begin with, they are more inclined to believe such conspiracy theories, resulting in a decrease in the intention to get vaccinated or to vaccinate their children (Ginossar et al., 2022; McCarthy et al., 2022).

 

Hashtag Jacking

Anti-vaxxers take advantage of a Twitter feature known as the ‘hashtag’: a word or series of words following the pound symbol (#). These keywords become hyperlinks that enable users to find related subjects (Khadafi et al., 2022). In addition, if enough users tag the same thing, a topic can begin trending, which increases its visibility on the platform and can potentially affect public opinion (Khadafi et al., 2022). Anti-vaxxers on Twitter have engaged in insidious hashtag jacking, a strategy that Abidin (2021) describes as latching onto a particular hashtag and inundating a platform with the aim to “mock, satirize or negatively critique” it (Gilkerson & Berg, 2018, as cited in Abidin, 2021).

 

#DoctorsSpeakUp

The hashtag #DoctorsSpeakUp was originally created by a doctor intent on spreading a science-backed, pro-vaccination message on Twitter (Bradshaw, 2022). However, the anti-vax community hijacked this message and twisted it to fit their own narrative, turning the hashtag into “questions such as, ‘When will #DoctorsSpeakUp that vaccines harm and kill children?’” (Morris, 2022, as cited in Bradshaw, 2022, para. 1). Bradshaw (2022) found that 71% of users who tweeted #DoctorsSpeakUp were anti-vaxxers flooding the platform with tweets on topics such as vaccine injury and uninformed doctors. Hoffman et al. (2021) found that anti-vax tweeters participating in the #DoctorsSpeakUp hashtag were more likely to use personal narratives and anecdotes. This plays on the emotions of their audience, which can often be the catalyst for vaccine hesitant individuals to reject vaccines (Nguyen & Catalan-Matamoros, 2022). The switch from doctors promoting the COVID-19 vaccine to anti-vaxxers condemning it was very effective in changing the narrative from an informed, science-backed pro-vaccination message to an anti-vaccination message veiled in fear and distrust. Furthermore, since vaccine hesitant individuals are much more likely to believe a negative change in COVID-19 vaccine information (Sharevski et al., 2022), the hijacking also generated additional distrust in the vaccine (Bradshaw, 2022). Anti-vaxxers on Twitter use hashtags as a “weapon” (Khadafi et al., 2022), and by repeatedly using this insidious hashtag jacking strategy, the anti-vax community sows a deeper vaccine distrust and lures the vaccine hesitant further onto their side.

 

Social Steganography

A common anti-vax expression, particularly throughout the COVID-19 pandemic and beyond, is “I’m not anti-vaccine, I’m just pro-safety” (DanielRollTide, 2021; Tuijnman, 2020; Victory, 2023). The anti-vax community uses this statement as a form of dog-whistle, garnering support for their movement (who can argue with being pro-safety?) while simultaneously revealing their allegiance to the anti-vax community by providing misinformation and seeding doubt regarding the efficacy of vaccines (Kata, 2012). Dog-whistling is a strategy of refracted publics known as social steganography: a way to implant a different kind of meaning into a seemingly innocuous message (Abidin, 2021). An example of this can be seen with Dr. Bob Sears, an American doctor and author known for his anti-vaccination/pro-safety stance, including delaying vaccines for children (Offit & Moser, 2009). Throughout the COVID-19 pandemic, Sears’ dog-whistles were shared on many platforms, including Twitter (kath2cats, 2020). Because Sears is a doctor, and therefore a figure of authority, the doubt he creates surrounding the COVID-19 vaccine has a greater negative effect on the vaccine hesitant and a potentially damaging effect on their children (Gorski, 2015; Offit & Moser, 2009). These dog-whistling tactics work particularly well on vaccine hesitant individuals; anti-vaccine information presented in a seemingly balanced, safety-focused way plays on their fears and can inflate their perception of potential vaccine risks and side effects (Kata, 2012; Skafle et al., 2022).

 

Twitter’s Role

Despite these damaging strategies used by the anti-vax community on Twitter, the platform has attempted to do its part in reducing the amount of misinformation circulating on it (Burki, 2020). By placing warning tags on posts and by limiting, removing and even banning users who post misleading and incorrect vaccine-related information, the platform has greatly reduced the prevalence of anti-vaccine content online (Blane et al., 2022). However, pockets of anti-vax rhetoric remain active on Twitter (R.FreedomWaves, 2023; TheChiefNerd, 2023), and since anti-vaxxers tend to be resolute in their beliefs, they likely always will (Armitage, 2021). While Twitter’s misinformation policies have been a great step in halting the viral spread of misinformation, they have not been a complete fix – some individuals still remember, and even believe, posts that have been labelled as misleading (Burki, 2020). Furthermore, censoring and shutting the anti-vax community down completely may cause vaccine hesitant individuals with valid concerns or questions to “sympathise with anti-vax rhetoric” (Armitage, 2021, para. 3). Indeed, Skafle et al. (2022) found that the vaccine hesitant disregarded platform warning tags if the message “aligned with their personal beliefs” (para. 24). Therefore, health professionals and governments need to focus on being far more transparent in their online messaging (Armitage, 2021; Ginossar et al., 2022; Nasralah et al., 2022). This can be done by building on “e-Health literacy” (Skafle et al., 2022, para. 37) so that vaccine hesitant individuals can accurately deconstruct vaccine misinformation, by posting more frequently, and by acting faster to address anti-vax claims (Burki, 2020).

 

Anti-vaxxers have devolved into a refracted public on Twitter by utilising sinister strategies such as clickbait tactics, hashtag jacking and social steganography. They specifically target vaccine hesitant individuals and destroy their faith in, and understanding of, vaccines. To its credit, Twitter has attempted to do its part in curbing the extensive reach of the anti-vax community on its platform. By implementing warning tags on posts and banning blatant offenders, it has managed to greatly reduce the amount of anti-vax content posted. However, it is clear that more needs to be done to properly address the concerns of the vaccine hesitant before they are swayed by the strategies and tactics of this refracted public on Twitter.

 

 

Reference List

Abidin, C. (2021). From “networked publics” to “refracted publics”: A companion framework for researching “below the radar” studies. Social Media + Society, 7(1), 1-13. https://doi.org/10.1177/2056305120984458

Alvarez-Zuzek, L.G., Zipfel, C.M., & Bansal, S. (2022). Spatial clustering in vaccination hesitancy: The role of social influence and social selection. PLoS Computational Biology, 18(10), Article e1010437. https://doi.org/10.1371/journal.pcbi.1010437

Armitage, R. (2021). Online ‘anti-vax’ campaigns and COVID-19: Censorship is not the solution. Public Health, 190, 29-30. https://doi.org/10.1016/j.puhe.2020.12.005

Astrup, J. E. (2019). Catching the anti-vax bug. Community Practitioner, 92(5), 14-17. https://www.proquest.com/scholarly-journals/catching-anti-vax-bug/docview/2233941007/se-2

Benoit, S. L., & Mauldin, R. F. (2021). The “anti-vax” movement: A quantitative report on vaccine beliefs and knowledge across social media. BMC Public Health, 21(1), 2106. https://doi.org/10.1186/s12889-020-12114-8

Blane, J. T., Bellutta, D., & Carley, K. M. (2022). Social cyber maneuvers during the COVID-19 vaccine initial rollout: Content analysis of tweets. Journal of Medical Internet Research, 24(3). https://doi.org/10.2196/34040

Bradshaw, A. S. (2022). #DoctorsSpeakUp: Exploration of hashtag hijacking by anti-vaccine advocates and the influence of scientific counterparts on Twitter. Health Communication, 1-11. https://doi.org/10.1080/10410236.2022.2058159

Burki, T. (2020). The online anti-vaccine movement in the age of COVID-19. The Lancet Digital Health, 2(10), 504-505. https://doi.org/10.1016/s2589-7500(20)30227-2

DanielRollTide. (2021, Dec 17). CDC warns against J&J Vaccine following deaths. Where is the Data on Pfizer and Moderna? Sealed I’m not Anti Vax [Image attached] [Tweet]. Twitter. https://twitter.com/DanielRollTide/status/1471596822653243401

Delanty, G. (2018). Community (3rd ed.). Routledge. https://doi.org/10.4324/9781315158259

Ginossar, T., Cruickshank, I. J., Zheleva, E., Sulskis, J., & Berger-Wolf, T. (2022). Cross-platform spread: Vaccine-related content, sources, and conspiracy theories in YouTube videos shared in early Twitter COVID-19 conversations. Human Vaccines & Immunotherapeutics, 18(1), 1-13. https://doi.org/10.1080/21645515.2021.2003647

Gorski, D. (2015, Jan 23). “Dr. Bob” Sears: Perfecting the art of the antivaccine dog whistle. Science Blogs. https://scienceblogs.com/insolence/2015/01/23/dr-bob-sears-perfecting-the-art-of-the-antivaccine-dog-whistle

Hampton, K. N. (2016). Persistent and pervasive community: New communication technologies and the future of community. American Behavioral Scientist, 60(1), 101-124. https://doi.org/10.1177/0002764215601714

Hoffman, B. L., Colditz, J. B., Shensa, A., Wolynn, R., Taneja, S. B., Felter, E. M., Wolynn, T., & Sidani, J. E. (2021). #DoctorsSpeakUp: Lessons learned from a pro-vaccine Twitter event. Vaccine, 39(19), 2684-2691. https://doi.org/10.1016/j.vaccine.2021.03.061

Kata, A. (2012). Anti-vaccine activists, web 2.0, and the postmodern paradigm: An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 30(25), 3778-3789. https://doi.org/10.1016/j.vaccine.2011.11.112

kath2cats. (2020, Mar 30). So Dr. Bob Sears is still at it, I see… spreading misinformation and selfishness. [Image attached] [Tweet]. Twitter. https://twitter.com/kath2cats/status/1244347100227198978

Khadafi, R., Nurmandi, A., Qodir, Z., & Misran. (2022). Hashtag as a new weapon to resist the COVID-19 vaccination policy: A qualitative study of the anti-vaccine movement in Brazil, USA, and Indonesia. Human Vaccines & Immunotherapeutics, 18(1), Article 2042135. https://doi.org/10.1080/21645515.2022.2042135

Low, S. [@ShawnLo94790920]. (2021, Oct 2). Check out the many toxic ingredients in the Covid-19 experimental vaccines. [Link attached] [Tweet]. Twitter. https://twitter.com/ShawnLo94790920/status/1444327946794135552

Maci, S. (2019). Discourse strategies of fake news in the anti-vax campaign. Languages Cultures Mediation, 6(1), 15-43. https://doi.org/10.7358/LCM-2019-001-MACI

Malisch, A. [@MalischAnni]. (2022, Sept 4). Toxic, Metallic Compounds Found in All #COVID #Vaccine Samples Analyzed by German Scientists “A group of independent German scientists found [Link attached] [Tweet]. Twitter. https://twitter.com/MalischAnni/status/1566237592814764033

McCarthy, M., Murphy, K., Sargeant, E., & Williamson, H. (2022). Examining the relationship between conspiracy theories and COVID-19 vaccine hesitancy: A mediating role for perceived health threats, trust, and anomie? Analyses of Social Issues and Public Policy, 22(1), 106-129. https://doi.org/10.1111/asap.12291

Mitra, T., Counts, S., & Pennebaker, J. W. (2016). Understanding anti-vaccination attitudes in social media. Proceedings of the International AAAI Conference on Web and Social Media, 10(1), 269-278. https://doi.org/10.1609/icwsm.v10i1.14729

Muric, G., Wu, Y., & Ferrara, E. (2021). COVID-19 vaccine hesitancy on social media: Building a public Twitter data set of antivaccine content, vaccine misinformation and conspiracies. JMIR Public Health and Surveillance, 7(11), Article 30642. https://doi.org/10.2196/30642

Nasralah, T., Elnoshokaty, A., El-Gayar, O., Al-Ramahi, M., & Wahbeh, A. (2022). A comparative analysis of anti-vax discourse on Twitter before and after COVID-19 onset. Health Informatics Journal, 28(4). https://doi.org/10.1177/14604582221135831

Nguyen, A., & Catalan-Matamoros, D. (2022). Anti-vaccine discourse on social media: An exploratory audit of negative tweets about vaccines and their posters. Vaccines, 10(12), Article 2067. https://doi.org/10.3390/vaccines10122067

Offit, P.A., & Moser, C. A. (2009). The problem with Dr Bob’s alternative vaccine schedule. Pediatrics, 123(1), 164-169. https://doi.org/10.1542/peds.2008-2189

R.FreedomWaves. (2023, Apr 1). Autopsy Studies of COVID-19 Illness Rule Out Extensive Myocarditis…. More proof that the vaccines are the killers!! [Link attached] [Tweet]. Twitter. https://twitter.com/RFreedomwaves/status/1642161616656498688

Sharevski, F., Huff, A., Jachim, P., & Pieroni, E. (2022). (Mis)perceptions and engagement on Twitter: COVID-19 vaccine rumors on efficacy and mass immunization effort. International Journal of Information Management Data Insights, 2(1), Article 100059. https://doi.org/10.1016/j.jjimei.2022.100059

Skafle, I., Nordahl-Hansen, A., Quintana, D. S., Wynn, R., & Gabarron, E. (2022). Misinformation about COVID-19 vaccines on social media: Rapid review. Journal of Medical Internet Research, 24(8), Article e37367. https://doi.org/10.2196/37367

TheChiefNerd. (2023, Apr 1). “Ten percent (10%) say a member of their household has died whose death they think may have been caused by [Image attached] [Tweet]. Twitter. https://twitter.com/TheChiefNerd/status/1642163882239172609

Tuijnman, D. [@DanielTuijnman]. (2020, Jul 15). Jenny McCarthy: “We’re not an anti-vaccine movement. We’re pro-safe-vaccine schedule.” [Link attached] [Tweet]. Twitter. https://twitter.com/DanielTuijnman/status/1283392477689450501

Victory, K. [@DrKellyVictory]. (2023, Jan 10). The whole “anti-vaccine” attack is old and tired. As you know, I’m not “anti-“ anything; I am “pro” safety, “pro” [Tweet]. Twitter. https://twitter.com/DrKellyVictory/status/1612516912252203008



Comments

23 responses to “Refracted Publics on Twitter: Anti-Vaxxers’ Role in Increasing Global Vaccine Hesitancy.”

  1. Stephen.B.Bain

    Hi Charlotte,

    This online-conference of ours is great isn’t it! I particularly am enjoying the opportunity to read multiple papers on the same topics such as the ‘Twitter-Vax’ affairs.

    Tonight I thought I’d sit down after dinner, reread Abidin (2021), and then read a few papers on refracted publics. In doing so, I’ve had the opportunity to reflect on refracted publics and ponder if they are a community that is on the ‘same-page’ or are they in opposition to the state for widely disparate reasons, only coming together to hate the common-enemy, with in-their-minds the end justifying the means – in other words do they truly believe everything is a conspiracy or do they merely jump on the bandwagon in order to attack the ‘management’ because they don’t like taxes/regulation/enforcement/lockdowns/being-told etc.

    I’m thinking that there’s a requirement for sinister-intent to be an element of a refracted public.

    As studies of “below the radar” cultures develop (Abidin, 2021), and based on your investigation into the recent example(s), what do you think may be some of the additional criteria, if any, that could be assessed for consideration as viable additions to the checklist of the behaviours and motivators of a refracted public?

    Kind regards
    Steve

    Reference:
    Abidin, C. (2021). From “networked publics” to “refracted publics”: A companion framework for researching “below the radar” studies. Social Media + Society, 7(1), 1-13. https://doi.org/10.1177/2056305120984458

    1. Charlotte Phillips

      Hi Steve,

      Thanks so much for reading my paper and for your insightful commentary! I quite agree, the conference is a fantastic way of consolidating all these different ideas around social media, communities and networks.

      The Abidin paper is quite possibly my favourite from our list of essential readings. It’s quite complex to wrap one’s head around at first, but I found it extremely interesting! I don’t think you’re necessarily wrong, there are certainly individuals out there who choose to ‘jump on the bandwagon’ and use any old excuse to launch their crusade against xyz. Abidin (2021) states that “refracted publics are a product of the landscape” (para. 1) – in other words, the pandemic/fake news/infodemic environment on these platforms online really laid down the foundation for the anti-vax community to devolve into a refracted public. My research has also shown that they are a community that work together quite effectively – the way they banded together by engaging in hashtag jacking in order to sow doubt about the vaccine’s efficacy among the public, for example.

      You raise a really thought-provoking question here and one I’m not sure I can effectively answer – Abidin is quite thorough in her assessment of what makes cultures and communities refracted publics online. Although as social media continues evolving, the strategies of these refracted publics will have no choice but to do the same. I wonder if a better approach would be to consider the ways in which we can educate the public to better recognise and dismiss the more sinister workings of refracted publics?

      Food for thought, indeed!

      Charlotte.

      Ref:
      Abidin, C. (2021). From “networked publics” to “refracted publics”: A companion framework for researching “below the radar” studies. Social Media + Society, 7(1), 1-13. https://doi.org/10.1177/2056305120984458

  2. Iesha Roberts

    Hi Charlotte,
    I really enjoyed this! It was a fantastic read, and I liked how well-referenced and well researched it was, particularly hashtag jacking and the example of #DoctorsSpeakUp.

    Social Steganography was not a term I’d heard of before, but when you explained it, it was easy and interesting to understand. And very insidious and familiar – whether it be on social media or something more televised, it seems like a popular tactic to get what you want/the votes you want/the reaction you want, while not necessarily spewing outright lies, but definitely not full truths. Abidin (2021) mentions that it’s quite like encoding messages that can be read with one interpretation for one audience, and differently for another – I think that’s very apt, and particularly evident in your paper with the example of Dr. Bob Sears.

    I agree that Twitter needs to do more, and certainly, they’re trying with the messages beneath tweets. But, after the recent reforms of Twitter due to a change in management, it’s becoming even more difficult to tell the truth from the lies. There’s ‘Twitter Gold’ for organisations in beta [1] but even then, it’s still a subscription-based service. Do you think that these changes will negatively affect combating misinformation? It feels a little like one step forward, two steps back.

    Cheers!

    [1] https://help.twitter.com/en/using-twitter/twitter-blue#tbverified-orgs

    1. Charlotte Phillips

      Thanks Iesha!

      Abidin’s reading really educated me on the different strategies and tactics that refracted publics use – many of which I hadn’t heard of before, either! It’s a bit scary how prevalent dog-whistling really is… It’s actually super common in the political sphere, for the exact reasons you mention (getting the votes)! One of the things I came across in my research, which I didn’t add as it wasn’t relevant to Twitter specifically, was our then Prime Minister (Scott Morrison) dog-whistling support for anti-vax protests at a speech he made during the pandemic (Duckett, 2022).

      You bring up a fantastic point regarding misinformation on the platform now that the management has changed. The fact that anyone can “buy” a blue verified tick is already a massive blow to e-health posters as it gives anti-vaxxers licence to make their posts seem more credible (Trang, 2022). In addition, Musk has now abandoned the policies Twitter enforced in 2020, which will undoubtedly undo all the good it managed previously. Definitely a ‘1 step forward, 2 steps back’ type of feeling, I have to agree!

      Some great discussion points!

      Charlotte.

      Ref:
      Duckett, S. (2022). Public health management of the COVID-19 pandemic in Australia: The role of the Morrison government. International Journal of Environmental Research and Public Health, 19(16), Article 10400. https://doi.org/10.3390/ijerph191610400

      Trang, B. (2022, Nov 1). How Musk’s Twitter takeover could impact misinformation. STAT News. https://www.statnews.com/2022/11/01/how-musks-twitter-takeover-could-impact-misinformation/

  3. Ning Choi

    Hi Charlotte!

    This was a brilliant read on how easily misinformation can spread across social media – especially on Twitter. Coming from the football world, I am familiar with both the concept of Clickbait Tactics and Social Steganography (although I didn’t realise that this was the term for it), given how sly the news outlets revolving around the sport can be.

    For me what I found particularly interesting was the formation of the anti-vax community, especially in how their views seem to lean on the extremist side and were similar to the concept of “Ultras,” or extremist football fans in my paper.

    In saying that, what more do you think can be done by Twitter to combat the spread of misinformation? There was talk about having verification (i.e. attaching someone’s name and contact information) for all users of a platform like Twitter when there was rampant racism after the EUROS in the football space. Do you think this could prevent individuals from spreading clear misinformation?

    Ning

    1. Charlotte Phillips

      Hi Ning,

      Thanks so much! Yes, they are definitely strategies that can be used across all sorts of different subject matters. As I mentioned to Iesha, politics is a hot topic for it as well!

      Absolutely, and I think pre-Musk takeover this could have been an option. Health departments and government posters on Twitter nearly always had verified ticks next to their names; this helped other users understand that they were credible sources to be trusted (Trang, 2022). The tick back then had a strict identification process to complete before one could acquire it (name, contact info etc.). However, with the takeover by Musk, the monetisation of the blue tick (anyone can buy it now), and the subsequent abandoning of pandemic misinformation policies, I feel Twitter has gone right back to square one again.

      Charlotte.

      Ref:
      Trang, B. (2022, Nov 1). How Musk’s Twitter takeover could impact misinformation. STAT News. https://www.statnews.com/2022/11/01/how-musks-twitter-takeover-could-impact-misinformation/

  4. Avinash Assonne

    Hello Charlotte,

    Your paper was an absorbing read :). It really shows how dangerous and misleading the dissemination of fake information can be, especially when we were still in the midst of the COVID-19 pandemic. Although this issue has always been around, it became even more dangerous and detrimental during the pandemic, when many people were already emotionally weak or affected. During the COVID-19 outbreak, misinformation and “fake news” about the pandemic flooded the news and social media, which provoked speculative fears among many netizens.

    You probably already heard about some of them, but some examples of fake news in relation to the origin of COVID-19 include the consumption of ‘bat soup’ as the main cause of infection. Other examples of such misleading news include conspiracy theories, like how the USA concocted the virus to undermine the Chinese economy. The misinformation also comprises made-up cures such as consumption of marjoram (an aromatic plant), cleansing with salt water, drinking bleach and consuming herbal tea. Such false news promotes negative assumptions, conflict and racist thoughts while affecting the public’s perceptions undesirably.

    I have heard of Anti-Vaxxers and their discourse on social media but did not really get into it. Thus, your paper and the research that you did regarding the matter were very interesting, and it really encouraged me to delve deeper into the subject. Vaccine hesitancy has been a huge issue, but not only that, the aggressive behavior of anti-vaxxers and their toxicity, especially on Twitter, is just really disappointing. I can honestly say that I don’t support their movement at all. Those anti-vaxxers actually make me think of those aggressive vegan activists. Don’t you think that there are similarities, especially regarding the provocative and aggressive tactics that they use on social media?

    1. Charlotte Phillips

      Hi Avinash,

      Thank you for taking the time to read my paper! I’m glad it could shed some light on a subject you hadn’t got into before.

      Yes, you are correct in your analysis of the wide range of clickbait tactics in the form of conspiracy theories that anti-vaxxers used (and continue to use, to be honest) online. I definitely think that there are many other communities out there that act as refracted publics, vegan extremists included. During my research on hashtag jacking, I actually came across a vegan example – #Februdairy. The hashtag originally started as a way to highlight the dairy industry and its associated benefits (Williams, 2019). Instead, it was hijacked by vegans highlighting the suffering of the animals involved and the industry’s environmental impact. In addition (and as mentioned in another comment above), my research has also shown that social steganography is quite a common tactic used in the political sphere via the dog-whistle.

      With regards to your second comment, I think Twitter is in a worse off place than it started, to be honest! With the platform no longer enforcing the COVID-19 misinformation policies, I can’t help but think all their previous good work in attempting to shut down extreme anti-vax rhetoric during the pandemic will be all for naught. Especially when you consider the fact that the pandemic technically isn’t over yet!

      Thanks for your thoughts!

      Charlotte.

      Ref:
      Williams, D. K. (2019). Hostile hashtag takeover: An analysis of the battle for Februdairy. M/C Journal, 22(2). https://doi.org/10.5204/mcj.1503

      1. Stephen.B.Bain

        Hi Charlotte,

        Diving right in at the deep end. Your mention of vegans tag-jacking, for me links across to your enquiry via my paper about management of imagery with respect to conservation issues (such as the culling of lionfish).

        “You mention these images need to be critically assessed before publication, how do you think this could be efficiently managed on these networked publics…” (Phillips, 2023)

        I think that the efficient management of images relies largely on the proactive application of lessons learned from past anti-activism (such as the vegan-dairygate) that you mention in discussion on your paper.

        I’ve seen it in other spheres, where there is an unspoken culture that almost silently is activated to address issues … hopefully the environmental-managers and citizen-scientists have such an understanding (I think they do) … the lionfish case-study presents as a golden-egg laying goose, where spearfishos can be hunters who are the good-guys => it’s a space that they don’t wish to be damaged from the inside!

        I accept that this comment is a bit cliche-esque; however it’s really where it’s at when role-playing campaigns in-order to second-guess the opposition and implement risk-mitigation.

        btw: Congratulations on the amount of discussion that your paper is generating.

        ps: I don’t have an issue with vegans, … however some of them may have a problem with me (or rather they have problems/issues with the management techniques that I endorse)

        1. Charlotte Phillips

          Hi Steve,

          Absolutely, you’re right. We can learn important lessons from various other anti-activisms. Your answer regarding unspoken cultures that address issues is an interesting one! It just takes one misguided individual though, doesn’t it?

          Thanks for the brain food!

          Charlotte.

  5. Avinash Assonne

    Also, do you think Twitter is better equipped now to deal with such issues, or is it worse? Especially since Elon Musk took over!

    Regards,
    Avinash

  6. caesar.al-samarrie

    Hi Charlotte,

    This paper was a well-researched and insightful read. This article highlighted the tactics that refracted publics use on Twitter. I especially liked when you mentioned that anti-vaxxers tend to be resolute in their beliefs. I have seen this numerous times over the past few years. It is incredible how much influence this community has on social media. Thank you for highlighting the strategies and misconceptions regarding COVID-19 vaccines.

    Thanks,

    Caesar

    1. Charlotte Phillips

      Hi Caesar,

      Thanks for taking the time to read my paper!

      I agree, it is quite mind-blowing the way anti-vaxxers prey on the hesitant and twist the narrative to suit their agenda! Social media has really enabled them to target much larger audiences.

      The number of studies and amount of research based on anti-vaxxers is pretty incredible as well – we clearly need an overhaul of existing health education and to incorporate a more thorough informational campaign (targeting the general public) regarding vaccines.

      Kind regards,
      Charlotte.

  7. Sarah.Bailey

    Hi Charlotte,

    This was a great topic and a really interesting read!

    I was particularly interested in your discussion of Twitter’s response to the spread of anti-vax rhetoric on the platform since my own conference paper also dealt with the spread of extremist misinformation (albeit on Reddit instead of Twitter). Much of the pushback to the extremism on Reddit that I studied was done by other users (e.g. responding to misinformative posts with factual information), although my research found that this only served to strengthen users’ convictions in their initial viewpoints. I wonder, did you find any evidence that being provided with legitimate vaccine information/viewing Twitter’s warnings on content had a counterintuitive effect that inadvertently encouraged users further into their anti-vax stances?

    I can imagine, if you were an anti-vaxxer, receiving constant platform-imposed warnings about misinformation on your Tweets would inspire a negative and deeply frustrated reaction (and possibly play into the anti-vax mentality that mainstream media is trying to force harmful vaccines onto the public by suppressing the “truth”). In line with this, do you think there is any way to meaningfully dissuade people from extreme viewpoints like this when they are so far down radical pipelines that they refuse to engage with factual information in any capacity?

    Can’t wait to hear your thoughts!
    Sarah

    1. Charlotte Phillips

      Hi Sarah,

      Thanks for reading my paper and for your insightful comments and questions! I’ll be sure to give your paper a read as well.

      Yes, I absolutely did find evidence of that counterintuitive effect! Burki (2020) found that even with posts labelled or tagged as misleading/false, they can still be read, remembered and even believed. It risks making anti-vaxxers martyrs, reinforces conspiracy theories and heightens health authority mistrust (Armitage, 2021). It can effectively make it seem like their arguments have merit – “speak truth to power” and all that! Interesting that this was the same case on Reddit, although perhaps not surprising!

      I honestly don’t think there is much anyone can do once someone’s thinking has become that radical and extreme. The solution, instead, lies in educating and informing those that are vaccine hesitant BEFORE the strategies and tactics of anti-vaxxers lure them onto their side. As mentioned in my paper, we can do this through quicker, thorough rebuttals (Burki, 2020) and by building on e-Health (and science) literacy (Skafle et al., 2022). We definitely need to improve the general public’s knowledge and understanding of HOW vaccines work.

      Warm regards,
      Charlotte.

      Ref:
      Armitage, R. (2021). Online ‘anti-vax’ campaigns and COVID-19: Censorship is not the solution. Public Health, 190, 29-30. https://doi.org/10.1016/j.puhe.2020.12.005

      Burki, T. (2020). The online anti-vaccine movement in the age of COVID-19. The Lancet Digital Health, 2(10), 504-505. https://doi.org/10.1016/s2589-7500(20)30227-2

      Skafle, I., Nordahl-Hansen, A., Quintana, D. S., Wynn, R., & Gabarron, E. (2022). Misinformation about COVID-19 vaccines on social media: Rapid review. Journal of Medical Internet Research, 24(8), Article e37367. https://doi.org/10.2196/37367

      1. Sarah.Bailey

        Hi Charlotte,

        Thank you for your response! It provided me even more insight into the similarities between how factual information often serves to reinforce misinformed belief systems–it sure is a paradox!

        You’re right, the most proactive way to shut down extremism is to reach individuals whilst they’re still on the cusp of the ideology, and not all the way at the bottom of a pipeline. This makes me wonder, though, if everyone has the capacity to find themselves deeply embroiled in an extreme worldview (i.e., as an anti-vaxxer as in your paper, or an incel as in mine). Do you think some people are naturally predisposed to be more susceptible to these types of thinking/self-identification, or do you think, given the right environment, anybody could find themselves with a radical worldview?

        Looking forward to hearing your thoughts! 🙂
        Sarah

        1. Charlotte Phillips

          Hi Sarah,

          A paradox, for sure!

          I think both your questions are true. Some people definitely are more predisposed to radical thinking – everyone has different histories and different baggage to carry and I think this absolutely can affect one’s predisposition to this type of thinking (and this goes hand-in-hand with your paper on incels, as well).

          The COVID-19 pandemic created the perfect environment for anti-vaxxers to take centre stage. People were afraid, worried, emotional in general. Which is the perfect recipe for building distrust and inflating risks/side effects (Kata, 2012; Skafle et al., 2022).

          Having said that, I also discovered that science literacy would be hugely beneficial in preventing the general public from blindly believing viral misinformation on vaccines (Landrum and Olshansky, 2019). Though nothing is perfect and there will always be pockets of extremist individuals, it really comes down to adequate education, in my opinion!

          Great discussions, thanks so much!
          Charlotte.

          Ref:
          Kata, A. (2012). Anti-vaccine activists, web 2.0, and the postmodern paradigm: An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 30(25), 3778-3789. https://doi.org/10.1016/j.vaccine.2011.11.112

          Landrum, A. R., & Olshansky, A. (2019). The role of conspiracy mentality in denial of science and susceptibility to viral deception about science. Politics and the Life Sciences, 38(2), 193-209. https://doi.org/10.1017/pls.2019.9

          Skafle, I., Nordahl-Hansen, A., Quintana, D. S., Wynn, R., & Gabarron, E. (2022). Misinformation about COVID-19 vaccines on social media: Rapid review. Journal of Medical Internet Research, 24(8), Article e37367. https://doi.org/10.2196/37367

          1. Sarah.Bailey

            Hi Charlotte,

            Thank you for your reply, again! 🙂

            You’re right, in times of stress people are far more likely to turn to extremism. I wonder what the psychology is behind that–what is it about being put in a situation, like the pandemic, that makes people who would otherwise steer clear from extremist ways of thinking decide to give it a chance? Is it just the fact that the misinformation started spreading at the same time as the facts, so no one had a foundation for what was real and what was radical yet? Or, were people looking for something to blame? Something more tangible to protest and push back against than a virus that would not respond to protestation either way?

            I’d love to hear your thoughts!
            Sarah

  8. Charlotte Phillips

    Hi again, Sarah!

    I think a lot of it came down to the unknown and the very real fear that brought out in people. Then, of course, people just fed off that fear! The viral spread of misinformation certainly didn’t help, and I agree that some people just wanted something to blame. Take all the conspiracy theories surrounding the vaccine, for example – Marinthe et al. (2020) found that they tend to develop “in threatening moments of crisis” (p. 958), which I think is quite an apt description of the pandemic!

    I do think the pandemic and its effects on human behaviour are going to give researchers study material for a long time to come!

    Thanks again,
    Charlotte.

    Ref:
    Marinthe, G., Brown, G., Delouvee, S., & Jolley, D. (2020). Looking out for myself: Exploring the relationship between conspiracy mentality, perceived personal risk, and COVID-19 prevention measures. British Journal of Health Psychology, 25(4), 957-980. https://doi.org/10.1111/bjhp.12449

    1. Sarah.Bailey

      Hi Charlotte,

      I agree. I think paranoia in times of stress leads people to accept/believe things they wouldn’t otherwise. The pandemic certainly provided a change of pace (what an understatement!) that provided many facets for research and study!

      Thank you so much for your thorough and in-depth responses to my questions! I think we’ve had a really fantastic discussion over these few weeks! 🙂

      Sarah

      1. Charlotte Phillips

        Hi Sarah,

        Absolutely – thank you so much for an engaging discussion!

        All the best,
        Charlotte.

  9. Tien

    Hi Charlotte,

    Your paper is so interesting. I think that social media platforms such as Twitter really spread a lot of misinformation, especially because they’re so accessible and enable users to be pulled into the algorithm – they’ll be surrounded by so many conspiracy theories that it would seem true! I want to know whether Twitter uses other online networks or platforms to draw more users in. Is Twitter the base of all these conspiracy theories, creating branches to, let’s say, videos on YouTube to tell users more misinformation and theories? Either way, it’s mind-blowing how anti-vaxxers are willing to do almost anything to get people to think their theories are right, going as far as to click-bait the vulnerable!

    1. Charlotte Phillips

      Hi Tien,

      Thank you so much for reading my paper!

      Yes, the algorithms are definitely tricky, and easy accessibility means misinformation is also easily spread (Benoit & Mauldin, 2021) – a conundrum! During my research phase, I found that a lot of anti-vaxxers on Twitter linked to outside sources. One example is an American website, ‘Learn The Risk’, which claimed that vaccines are the cause of a significant number of deaths in children (Muric et al., 2021). Another is ‘Vaccine Impact’, a website that frequently displays antivaccination propaganda and promotes (false) alternative medicine (Muric et al., 2021). There were definitely shares to other online networks/platforms as well, though I’m not sure I could confidently say that just one specific one was the source of them. Conspiracy theories tend to begin in small radical sections of communities across all social media platforms, and then influencers and the media get hold of the ‘story’ and spread it further (Pertwee et al., 2022). You are absolutely right though, it is mind-blowing the way the anti-vax community has infiltrated the general public and had such an effect on the vaccine hesitant.

      Thanks again!
      Charlotte.

      Ref:
      Benoit, S. L., & Mauldin, R. F. (2021). The “anti-vax” movement: A quantitative report on vaccine beliefs and knowledge across social media. BMC Public Health, 21(1), 2106. https://doi.org/10.1186/s12889-020-12114-8

      Muric, G., Wu, Y., & Ferrara, E. (2021). COVID-19 vaccine hesitancy on social media: Building a public twitter data set of antivaccine content, vaccine misinformation and conspiracies. JMIR Public Health and Surveillance, 7(11), Article 30642. https://doi.org/10.2196/30642

      Pertwee, E., Simas, C., & Larson, H. (2022). An epidemic of uncertainty: Rumors, conspiracy theories and vaccine hesitancy. Nature Medicine, 28, 456-459. https://doi.org/10.1038/s41591-022-01728-z
