Dangers of social media misinformation – Chantal Deutrom
ABSTRACT
This paper identifies the anti-vaccination movement as a social media community. It analyses COVID-19 misinformation spread by the anti-vaccination movement on Facebook and Twitter and the social consequences of that misinformation. It argues that the online anti-vaccination movement may influence attitudes and social behaviours regarding the COVID-19 vaccine. Finally, it outlines the measures in place to combat COVID-19 misinformation on social media and suggests more rigid policing of such misinformation. The analysis is framed by Hampton and Wellman's (2018) concepts of persistent contact and pervasive awareness.
INTRODUCTION
While social media enables communication across distances, it also enables COVID-19 misinformation to spread (Cinelli et al., 2020). In February 2020, the World Health Organisation (WHO) declared misinformation about COVID-19 an infodemic (Nguyen & Catalan-Matamoros, 2020). An infodemic is defined as "too much information including false or misleading information in digital and physical environments during a disease outbreak" (WHO, n.d., para. 1). The studies reviewed in this paper suggest that the anti-vaccination movement may be responsible for much of the COVID-19 misinformation spread on social media. This paper analyses the anti-vaccination movement as a community and its relationship with social media. It examines how the community is formed, how it spreads misinformation, the social behaviours that result from COVID-19 misinformation on social media, and how various entities are combatting that misinformation. This paper argues that the anti-vaccination community potentially incites COVID-19 vaccine hesitancy and influences offline social behaviours by using Facebook and Twitter to spread COVID-19 misinformation.
THE ONLINE ANTI-VACCINATION COMMUNITY
Research has suggested that the anti-vaccination movement is responsible for misinformation spread on social media (Germani et al., 2021; Wilson & Wiysonge, 2020; Puri et al., 2020). The movement originated offline and has grown over the decades, a growth expedited by the publication of a now-retracted 1998 study which claimed vaccines caused autism (Germani et al., 2021). The anti-vaccination movement has used social media platforms to grow its community. According to Germani et al. (2021), the anti-vaccination movement can credit its "success" to "a strong sense of community" based on common interests, personal beliefs and emotional language (para. 1). In the digital era, this sense of community can be partially credited to the internet and its ability to enable those with shared commonalities and preconceived notions to find each other, thus validating their opinions (Delanty, 2018).
Hampton and Wellman (2018) state that the "persistent contact and pervasive awareness" that social media provides changes the shape of communities, while Delanty (2018) states that virtual communities exist in the same way traditional communities do because communication is "the essential feature of belonging" (p. 201). Social media allows members of the anti-vaccination movement constant access to one another, providing them with insight into each other's thoughts and pursuits. Clearly, social media platforms have changed the anti-vaccination community because they change its behaviours, particularly by offering new and easier means of spreading misinformation and maintaining contact. Changes in community can be credited to the prevalence of new technologies, which ultimately adjust how communities conduct themselves (Hampton & Wellman, 2018). While social media has changed how the anti-vaccination community operates, its virtual community has not diminished its traditional community. Social media communities continue to merge with offline communities (Hampton & Wellman, 2018).
Additionally, Ray Oldenburg's theory of the third place (Oldenburg & Brissett, 1982) may apply to social media platforms. A third place is a place outside of home (the first place) and work (the second place); for the anti-vaccination movement, social media platforms fill this role, providing a 'place' to swap insights and connect. With Delanty's notion of virtual community in mind, the anti-vaccination movement could be considered an online community because it is on social media platforms that its members communicate and therefore belong. Germani et al. (2021) state that it is the shared commonalities between anti-vaccination supporters that engender a strong anti-vaccination community. This virtual community has likely expanded the anti-vaccination movement's strength and reach.
THE MAIN SOURCES OF COVID-19 MISINFORMATION
During COVID-19, misinformation came from multiple sources (Puri et al., 2020), the most notable and largest driver being former United States (US) President Donald Trump (Germani et al., 2021). This could possibly be attributed to his verified Twitter account: posts from verified Twitter and official Facebook accounts are more likely to be retweeted and reshared (Yang et al., 2021). In 2020, Trump had comments deleted by Facebook and hidden by Twitter for suggesting that COVID-19 was less lethal than influenza (Spring, 2020). This prompted anti-mask and pro-Trump Facebook groups to post that COVID-19 was not legitimate (Spring, 2020). A study by Yang et al. (2021) found that many anti-vaccination Facebook and Twitter accounts were right-wing and affiliated with the US. This supports findings by Germani et al. (2021) that the main "anti-vaccination influencers" (para. 14), apart from Trump himself, are linked to him; they include Donald Trump Jr, Charlie Kirk and other pro-Trump supporters (Germani et al., 2021). Interestingly, anti-vaccination sentiment was also found to be more prominent among individuals who live in areas with greater wealth and access to education (Wilson & Wiysonge, 2020). A study by Jamison et al. (2020) found that, between December 2018 and February 2019, 145 out of 500 Facebook advertisements had anti-vaccination themes, and 54% of anti-vaccination advertisements originated from two sources (the World Mercury Project and Stop Mandatory Vaccinations). Sometimes anti-vaccination social media accounts, potentially run by a single body, synchronise to "increase influence and evade detection" (Yang et al., 2021, p. 10). On Twitter, anti-vaccination supporters tweeted less than pro-vaccination supporters but were more likely to interact with others in replies and retweets, acting as echo chambers (Germani et al., 2021). Social media algorithms also generate echo chambers, further corroborating anti-vaccination sentiments, providing confirmation bias and limiting exposure to different sources of information (Gisondi et al., 2022). Most COVID-19 misinformation appears to come from sources with commonalities in political leanings, modes of influence, social status and beliefs. These commonalities, such as being right-leaning, wealthy, well-educated, US-based and social media-verified, could potentially be used as key identifiers of the main participants in the online anti-vaccination community. One key feature of the anti-vaccination community is the validation of COVID-19 misinformation through mechanisms such as echo chambers and post sharing. This potentially makes anti-vaccination misinformation more prevalent on social media.
FACEBOOK AND TWITTER ENABLE THE SPREAD OF COVID-19 MISINFORMATION
Social media has enabled the anti-vaccination community to share misinformation with larger audiences, and anti-vaccinators prefer platforms like Facebook and Twitter for this reason (Germani et al., 2021). In the early days of COVID-19, Facebook and Twitter outpaced traditional media in circulating low-credibility posts about COVID-19 (Gisondi et al., 2022). In a study by Yang et al. (2021), Twitter was found to carry a larger percentage of low-credibility information than Facebook, 32% and 21% respectively. Yang et al. (2021) claim that the majority of Facebook and Twitter accounts sharing top low-credibility sources were considered official (65.2% and 71.4% respectively) and were responsible for being the "top spreaders" of misinformation on both platforms (78.3% on Facebook and 76.2% on Twitter). In 2020, tweets about COVID-19 were posted every 45 milliseconds and #coronavirus was the second most popular hashtag (Puri et al., 2020). Facebook and Twitter have allowed the anti-vaccination community to project COVID-19 misinformation inside and outside the bounds of their community. Social media is not as heavily regulated for misinformation as traditional media; the anti-vaccination community therefore uses it to spread misinformation more widely, swiftly and easily.
Algorithms, incidental news exposure and misinformation sharing on social media potentially create a misinformed audience and can influence their behaviour. According to Cinelli et al. (2020), misinformation on social media has been found to "spread faster and wider than fact-based news" (p. 1). Misinformation can spread rapidly due to the sharing affordances of the internet and the lack of fact-checking and editorial oversight on social media platforms (Gisondi et al., 2022; Puri et al., 2020). The algorithms that social media platforms use may also increase the spread of misinformation in social media feeds because they cater to users' likes, views and interests (Cinelli et al., 2020; Germani et al., 2021). Additionally, some people encounter news incidentally via their social media feeds (Boczkowski et al., 2018), but this news may not always be accurate. Some users do not seek accurate news sources outside of those platforms; however, seeing news on social media may prompt other users to seek out accurate news sources (Boczkowski et al., 2018). Other research has found that the "emotional appeal" of misinformation contributes to its success and spread, in contrast to pro-vaccination posts, which usually adopt more scientific and fact-based tones (Germani et al., 2021; Nguyen & Catalan-Matamoros, 2020). The presentation and tone of COVID-19 misinformation on social media aid in its prevalence. Social media affordances can increase users' exposure to misinformation and ultimately affect their decision making.
COVID-19 MISINFORMATION INCITES OFFLINE SOCIAL BEHAVIOURS
COVID-19 misinformation on social media potentially incites vaccine hesitancy, and research shows that this is likely linked to the anti-vaccination movement (Puri et al., 2020). Exposure to misinformation has been found to strongly affect how some people behave and the decisions they make, particularly how they respond to government advice about COVID-19 public health, and to increase vaccine hesitancy (Cinelli et al., 2020; Puri et al., 2020). Other research predicted that fewer people would be willing to receive vaccines because they believed them to be dangerous due to the widespread presence of "foreign disinformation operations" on social media (Wilson & Wiysonge, 2020). Disinformation, unlike misinformation, is deliberately deceptive, but it has also been statistically shown to impact vaccination rates (Wilson & Wiysonge, 2020). COVID-19 misinformation on social media may lessen public credence in the efficacy and development of COVID-19 vaccines and promote misinformation about negative side effects, thus causing vaccine hesitancy (Puri et al., 2020). For example, a false story claimed that a participant in the Oxford University COVID-19 vaccine clinical trial passed away because of the vaccine (Megget, 2020). While untrue, this misinformation potentially impacted vaccine hesitancy (Megget, 2020). According to Puri et al. (2020), information about the influenza vaccine on Facebook and Twitter was found to increase vaccine uptake. However, when information about adverse effects of the influenza vaccine was included in a social network simulation model, vaccine-preventable illnesses increased (Puri et al., 2020). This shows that negative information about vaccines on social media can cause vaccine hesitancy. Vaccine hesitancy can have social consequences, such as undermining herd immunity and the prevention of further COVID-19 outbreaks (Yang et al., 2021). The anti-vaccination community has a clear impact on how people behave. Without the COVID-19 misinformation it creates and spreads on social media, it is likely that fewer people would be exposed to anti-vaccination sentiment that can influence vaccine hesitancy. Vaccine hesitancy is not the only social impact of COVID-19 misinformation on social media.
COVID-19 misinformation on social media can influence social behaviour offline, and this behaviour may be harmful (Nguyen & Catalan-Matamoros, 2020). In fact, the WHO claims an infodemic "causes confusion and risk-taking behaviours that can harm health", invokes distrust in health authorities and erodes public health responses (WHO, n.d., para. 1). Social behaviours correlated with COVID-19 misinformation on social media include vaccine reluctance, disregard of public health mandates, misuse of drugs, and anti-social behaviours (Cinelli et al., 2020; Wilson & Wiysonge, 2020; Gisondi et al., 2022; Tuccori et al., 2020; Nguyen & Catalan-Matamoros, 2020). COVID-19 misinformation on social media has been identified as having a direct link to decreases in mean vaccination rates and vaccine uptake (Cinelli et al., 2020; Wilson & Wiysonge, 2020). In fact, COVID-19 misinformation has resulted in individuals refusing to accept COVID-19 vaccines outright (Gisondi et al., 2022). Additionally, some people disregarded face masks and social distancing, potentially exposing high-risk people to COVID-19 (Gisondi et al., 2022). Misinformation on social media has also included claims that certain drugs could prevent or treat COVID-19, claims that were often inaccurate or backed by weak scientific evidence (Tuccori et al., 2020). Tuccori et al. (2020) state that misinformation influenced the "hazardous use" of drugs, including overdosing and taking drugs with similar ingredients or names. It may also have contributed to drugs, particularly antivirals, selling out, which meant those who genuinely needed medication missed out (Tuccori et al., 2020). Violent anti-social behaviours have also resulted from COVID-19 misinformation on social media. The 5G conspiracy theory, the claim that mobile phone networks and towers impacted immune systems, saw people in several countries ignore lockdown rules to destroy 5G towers (Nguyen & Catalan-Matamoros, 2020). These behaviours can potentially be credited to COVID-19 misinformation spread by the anti-vaccination community, facilitated by the persistent contact and pervasive awareness that social media affords. Social media enables increased availability of others, access to diverse opinions, prolonged community ties and ways of presenting information that are not available in traditional communities (Hampton & Wellman, 2018). Increased persistence and awareness enable misinformation spread which, once seen, can have undesirable social impacts. These impacts, and the COVID-19 misinformation spurred by the anti-vaccination movement, have prompted proactive responses from various entities.
ENTITIES BATTLE COVID-19 MISINFORMATION ON SOCIAL MEDIA
With so much misinformation about COVID-19 circulating, governments, organisations and social media platforms have implemented measures to combat it. These include flagging social media posts that display inaccurate information and redirecting users to fact-based articles (Cinelli et al., 2020; Ndiaye, 2021). The Victorian Department of Health deployed messaging in various languages, targeting diverse communities in response to viral misinformation videos spread by anti-vaccinators (Yussuf, 2021). Facebook has a dedicated website designed to combat misinformation across its app suite (Jin, 2020). Facebook also published a media release outlining the "aggressive steps" it was taking against COVID-19 misinformation, including a COVID-19 Information Center and a partnership with the International Fact-Checking Network (Clegg, 2020). Likewise, the WHO established a coronavirus myth-busting page on its website to combat misinformation (Nguyen & Catalan-Matamoros, 2020). Twitter is another platform that has acted. One notable action was suspending Donald Trump's Twitter account for inciting anti-vaccination rhetoric (Germani et al., 2021). Twitter also has a 'Help Center' which outlines how it addresses misinformation, including a COVID-19 misleading information policy; labelling, limiting or removing content; and proactively featuring credible information during important events such as COVID-19 or elections (Twitter, n.d.). Despite these measures, not all misinformation has been removed from social media platforms (Nguyen & Catalan-Matamoros, 2020). Misinformation remains, and so do the potential dangers that come with it. According to Hampton and Wellman (2018), community, persistence and awareness have consequences; clearly, COVID-19 misinformation is one of these. While increased persistence and awareness have their positives, they can also create mental disharmony and discomfort in people due to clashing beliefs. This can make combatting COVID-19 misinformation difficult.
CONCLUSION
This paper has examined the anti-vaccination community on Facebook and Twitter; how COVID-19 misinformation influences vaccine hesitancy and other offline social behaviours; and what entities are doing to combat misinformation on social media. The anti-vaccination movement has built strong social media communities and effectively uses Facebook and Twitter to share misinformation, which subsequently influences negative social behaviours offline. Research outlined in this paper has shown that the presence of COVID-19 misinformation created by the online anti-vaccination community has direct impacts on what people believe and how they behave. Vaccine hesitancy, disregard of government health guidelines and anti-social acts are some behaviours that correlate directly with the anti-vaccination community's COVID-19 misinformation on Facebook and Twitter. Since the anti-vaccination movement existed prior to social media, and social media has strengthened its communication and community, it is unlikely the movement will dissipate; social media has only changed how the anti-vaccination community operates. The ongoing challenge is to drown out anti-vaccination rhetoric on social media so that more individuals are protected against misinformation that may influence their social behaviours and cause harm. To limit the harm caused by COVID-19 misinformation, policing needs to occur: social media platforms should work with national and international authorities to implement stricter measures against published misinformation that may cause harm. While current measures have shown some success, more work needs to be done in this space.
© Chantal Deutrom 2022 All Rights Reserved.
Producer of this work retains all copyright ownership.
REFERENCES
Boczkowski, P. J., Mitchelstein, E., & Matassi, M. (2018). “News comes across when I’m in a moment of leisure.” Understanding the practices of incidental news consumption on social media. New Media & Society, 20(10), 3523-3539. https://journals.sagepub.com/doi/pdf/10.1177/1461444817750396
Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., Zola, P., Zollo, F., & Scala, A. (2020). The COVID-19 Social Media Infodemic. Scientific Reports, 10(1), 16598. http://dx.doi.org/10.1038/s41598-020-73510-5
Clegg, N. (2020). Combating COVID-19 Misinformation Across Our Apps. Meta. https://about.fb.com/news/2020/03/combating-covid-19-misinformation/
Delanty, G. (2018). Virtual Community: Belonging as Communication. In Community (3rd ed., pp. 200-224). Routledge. https://doi.org/10.4324/9781315158259
Germani, F., & Biller-Andorno, N. (2021). The anti-vaccination infodemic on social media: A behavioral analysis. PloS One, 16(3). http://dx.doi.org/10.1371/journal.pone.0247642
Gisondi, M. A., Barber, R., Faust, J. S., Raja, A., Strehlow, M. C., Westafer, L. M., & Gottlieb, M. (2022). A deadly infodemic: Social media and the power of COVID-19 misinformation. Journal of Medical Internet Research, 24(2), 1-7. http://dx.doi.org/10.2196/35552
Hampton, K. N., & Wellman, B. (2018). Lost and Saved . . . Again: The Moral Panic about the Loss of Community Takes Hold of Social Media. Contemporary Sociology, 47(6), 643-651. https://www.jstor.org/stable/10.2307/26585966
Jamison, A. M., Broniatowski, D. A., Dredze, M., Wood-Doughty, Z., Khan, D., & Quinn, S. C. (2020). Vaccine-related advertising in the Facebook ad archive. Vaccine, 38(3), 512-520. http://dx.doi.org/10.1016/j.vaccine.2019.10.066
Jin, K.-X. (2020). Keeping People Safe and Informed About the Coronavirus. Meta. https://about.fb.com/news/2020/12/coronavirus/
Megget, K. (2020). Even covid-19 can't kill the anti-vaccination movement. BMJ, 369, m2184. http://dx.doi.org/10.1136/bmj.m2184
Ndiaye, A. (2021, March 10). Together against COVID-19 misinformation: A new campaign in collaboration with the WHO. Meta. https://www.facebook.com/formedia/blog/together-against-covid-19-misinformation-a-new-campaign-in-partnership-with-the-who
Puri, N., Coomes, E. A., Haghbayan, H., & Gunaratne, K. (2020). Social media and vaccine hesitancy: new updates for the era of COVID-19 and globalized infectious diseases. Human Vaccines & Immunotherapeutics, 16(11), 2586-2593. https://doi.org/10.1080/21645515.2020.1780846
Nguyen, A. & Catalan-Matamoros, D. (2020). Digital Mis/Disinformation and Public Engagement with Health and Science Controversies: Fresh Perspectives from Covid-19. Media and Communication, 8(2), 323-328. http://dx.doi.org/10.17645/mac.v8i2.3352
Oldenburg, R., & Brissett, D. (1982). The Third Place. Qualitative Sociology, 5(4), 265-284.
Spring, M. (2020, October 6). Trump Covid post deleted by Facebook and hidden by Twitter. BBC. https://www.bbc.com/news/technology-54440662
Tuccori, M., Convertino, I., Ferraro, S., Cappello, E., Valdiserra, G., Focosi, D., & Blandizzi, C. (2020). The Impact of the COVID-19 "Infodemic" on Drug-Utilization Behaviors: Implications for Pharmacovigilance. Drug Safety, 43(8), 699-709. http://dx.doi.org/10.1007/s40264-020-00965-w
Twitter. (n.d.). How we address misinformation on Twitter. https://help.twitter.com/en/resources/addressing-misleading-info
Wilson, S. L., & Wiysonge, C. (2020). Social media and vaccine hesitancy. BMJ Global Health, 5(10), 1-7. http://dx.doi.org/10.1136/bmjgh-2020-004206
World Health Organisation. (n.d.). Infodemic. https://www.who.int/health-topics/infodemic#tab=tab_1
Yang, K., Pierri, F., Hui, P., Axelrod, D., Torres-Lugo, C., Bryden, J., & Menczer, F. (2021). The COVID-19 Infodemic: Twitter versus Facebook. Big Data & Society, 8(1). http://dx.doi.org/10.1177/20539517211013861
Yussuf, A. (2021). Anti-vax ‘fearmongers’ spreading misinformation are targeting Australia’s diverse communities, leaders, experts warn. ABC. https://www.abc.net.au/news/2021-09-06/fears-misinformation-targeting-australias-diverse-communities-/100405706
Hey Chantal, what an interesting read! So informative and well-written. I was initially surprised to learn that the anti-vaccination sentiment was more prominent in areas of greater wealth and education thinking that they should know better. Upon deeper analysis, it did make more sense as many of the key contributors to controversial movements are often privileged, white, straight, cisgender and able-bodied individuals. It seems that those who possess such traits traditionally deemed as more accepted within society value things like authority and power as they have grown up being afforded such concepts. I think the point made about algorithms on social media is crucial and quite possibly could be the answer to the issue of misinformation. Once a user stumbles upon one way of thinking online, they are inundated with articles and profiles voicing the same information. It feels like algorithms are inadvertently (or purposely) brainwashing consumers as well as content creators by further validating their false perceptions. Another thing that came to mind as I was reading this paper was clickbait; whilst it is definitely prominent on the cover pages of magazines and titles of newspaper articles, it feels like social media has allowed clickbait to become more insurmountable and pervasive than ever before. Far too often, I hear friends and family relaying information to me seen in the first headline of a social media post only to find out that they haven’t even clicked on the link to read the full article. What I often come to find is that many media publications and content creators are able to put whatever false information they like in the headline of a post, even if it doesn’t actually align with the rest of the content published in the physical article. I wonder if this is a loophole that hasn’t yet been addressed and if you think it is a significant contributor to this problem. Similarly, I wrote about the spread of misinformation within the wellness industry in my paper and touched on vaccine hesitancy in the current pandemic we find ourselves in. My paper focussed more on wellness influencers, such as Pete Evans, who used their platforms to generate a sense of distrust in Western medicine and scientific evidence-based information. Would love to hear your thoughts! https://networkconference.netstudies.org/2022/csm/688/social-media-weaponised-in-the-wellness-community/
Hi Amber, thanks for reading my paper! I also found it interesting (and quite concerning) that people who come from areas of greater wealth and education believe in anti-vaccination sentiment. You would think that they should know better. I do wonder if the typical white, male, cisgender person you describe has more influence on misinformation spread because they have more access to the internet and social media than any other demographic worldwide, such as lower socioeconomic and other ethnic groups. Perhaps virtual communities are more prevalent in western societies? I agree with what you say about clickbait; I believe offline communities can enable the spread of information found online because of clickbait. I think as long as there are community groups there will be gossip and misinformation. I also believe that clickbait headlines should be policed because, as you said, most people will not read full articles and only headlines. This could potentially be explored in future studies of news on social media, an expansion on the paper by Boczkowski et al. (2018). Another interesting concept for future papers would be whether algorithms are complicit in misinformation spread. It could be argued that algorithms, despite catering to users' likes and interests, purposely present users with potentially dangerous information. If this is the case, should social media platforms be held accountable for negative outcomes? Should algorithms be removed altogether? Or should social media platforms take more action to present users with various opinions and information? It could also be argued that the removal of algorithms could diminish the concept of virtual communities because similar information wouldn't be collated and presented to similar demographics. What do you think? I look forward to reading your paper 😁
Hi Chantal,
I really enjoyed reading your paper! You have done some excellent research and put forward a strong argument. I was instantly drawn to the title of your paper as I have written a similar essay. The very first essay I wrote when I started Uni in mid-2020 was titled ‘The role of social media and the spread of misinformation’. By the time I had finished my research and essay I had developed a strong loathing for misinformation and the people/groups that spread it. I too used anti-vax and Trump as examples. The damage they have done and continue to do through the dissemination of false information is infuriating. The way they package it so that it sounds legitimate and feeds into people’s biases makes it challenging to combat them. I remember one day (early pandemic when the vaccine was only in the trial stages) my Mum came home from work and said after what someone at work told her she was not going to get the vaccine because it changes our DNA. I asked her where this person got the info from and it was from a friend of theirs on FB who had shared a post from one of her friends who got it off a friend who was anti-vax. So basically, it was 5th-hand information that was entirely incorrect but because the information came from a friend who she trusts she had believed it. This friend also had a few websites that backed up the claim that the vaccine will change DNA. I went online straight away, found the correct information from multiple reputable sources and debunked the misinformation and showed her that the sites her friend used as “evidence” were not reputable or legitimate sources. She then went back to work and told her friend the CORRECT information and has since got the vaccine and no longer gets caught up in the “echo chamber effect” where we are inclined to be more trusting and believing because the information comes from a friend.
Ever since that very first essay, I have ditched mainstream media in favour of The Guardian and ABC News and when it comes to particular information I go and independently fact-check it before passing on the information to family/friends etc. Misinformation grinds my gears and it angers me that people intentionally spread false information that can prevent a person from receiving life-saving medicine etc. I agree that stricter measures and policies need to be put in place to prevent the spread of misinformation.
And as harsh as this may seem…I think there should be legal consequences and criminal charges placed on anyone who intentionally creates and spreads misinformation. Would love to hear your thoughts on this!
Great paper!!
Ash
Hi Ash
Thanks! I also get so mad about misinformation. There are people I know who choose not to be vaccinated because of it and I know they’re putting themselves at risk.
I do agree with you that there needs to be stricter policing and legal consequences for misinformation because, as you and I have both found in our research, it can have potentially harmful consequences. I think the issue with policing is who would be responsible for it. Would it be the social media platforms? Would it be law enforcement? And how would we police misinformation in our country if it comes from overseas? There are a lot of questions that need answering and no easy fix, with money and physical borders playing a role. Who would you suggest be responsible for policing and/or legal consequences for misinformation?
With regards to preventing misinformation spread, do you think there's a chance that this could limit free speech? Is the presence of misinformation something that we need to come to terms with in order to be able to communicate freely on social media and the internet?
Also keen to hear what anyone else thinks about my above questions.
Cheers
Chantal
Hi Chantal and Ash
Great read Chantal and feedback Ash. I admire your stance. It’s a shame that we don’t all have the knowledge and awareness of where to find credible evidence-informed information, as opposed to the misinformation found online and spread through what were previously reputable news organisations. Do you think there needs to be more of a focus on this in schools, workplaces and by governments – to educate people on how to dispel misinformation? Rather than relying on platforms themselves to police it?
Despite the original article you mentioned, linking vaccinations to autism, being retracted by the co-authors and The Lancet, the journal it was published in, and the fact that Wakefield was de-licensed by medical authorities for his deceit, people still refer to it today to support their anti-vaccination stance.
I wrote about misinformation and how health promotion practice can dispute it and use social media platforms to create healthier communities – one community at a time! In my research I found that more than a quarter of the most watched COVID-19 related videos on YouTube, equating to over 62 million views, consisted of misinformation and were misleading. I'd love to hear what you thought of it: https://networkconference.netstudies.org/2022/csm/325/social-media-and-online-communities-provide-opportunities-for-health-promotion-practice-to-increase-its-effectiveness-and-dispel-health-misinformation/
Hi Genevieve,
Yes, there definitely needs to be more education on misinformation. The more places the better! If we get schools, workplaces and governments to implement more education, training and awareness of misinformation, then it can be tailored to specific audiences and regions. I know of a couple of older people in my family who think that 'if it's on the internet then it must be true'. So frustrating! As always, education is key.
With regard to Wakefield and people referencing him, it’s clear that people supporting anti-vaccinators are dismissing the facts to fit their agenda. What could be more ignorant and deceitful than that? Do you think they should face consequences because of this? If so, what consequences?
I can’t believe that more than a quarter of COVID-19 videos on YouTube contained misinformation or were found to be misleading. How scary! There definitely needs to be more auditing of content that is posted online so that action is taken against misinformation before it can be viewed and shared. However, this could impact freedom of speech and the essence of virtual communities. As Delanty (2018) states, communication is belonging. How do you think we should navigate this further?
Cheers
Chantal
Hi Chantal
From what I understand Wakefield himself continues to share his opinions and is still considered a voice of authority and experience for the anti-vax cohort. For some, when it comes to autism in particular, I think they just want to understand how and/or why it occurs. Other than an acceptance that it is influenced by a combination of genetic and environmental factors, there is still not a singular unifying cause identified (Hodges et al., 2020).
I believe we need a combined approach from social media platforms, governments, workplaces, schools and individuals if we are to raise awareness and educate on misinformation. There will always be people and communities with differing opinions, we just need to reduce harm as much as possible and show respect. Easily said I know!
Best, Gen
Reference:
Hodges, H., Fealko, C., & Soares, N. (2020). Autism spectrum disorder: definition, epidemiology, causes, and clinical evaluation. Translational pediatrics, 9(Suppl 1), S55–S65. https://doi.org/10.21037/tp.2019.09.09
Hey Chantal,
I really enjoyed your article. Since the covid19 pandemic, there has also been a lot of fake news on social media.
For example, the misinformation about vaccines mentioned in your article. Some people are passing on the wrong vaccine information on social media for their own benefit, causing many people to be afraid to get vaccinated, which is really an abominable act. I remember in the early days of covid19, there were many news stories promoting the refusal of masks. A friend of mine also refused to wear a mask because of the news. As a result, he contracted covid and became very ill and was admitted to the ICU. Fortunately, he has now recovered. But the fake news during covid19 has been a serious danger to people's lives.
But as you say in your article, the regulation of fake news today is not good enough to completely curb it, so I think the most important thing we should do is to make sure that we and our family and friends are not affected by fake news.
Hi Sining,
First of all, I'm glad your friend is ok. They are a clear example of the harmful consequences that misinformation can have. I think fearmongering plays a major part in misinformation; people act a certain way because misinformation makes them afraid. See Yussuf (2021) and Nguyen & Catalan-Matamoros (2020). Fear, I think, comes from a lack of knowledge. As I stated above to Genevieve: education is key! In addition to educating our friends and family, how else do you think we should monitor, prevent and regulate misinformation?
Cheers
Chantal
Hi Chantal,
Your paper was an interesting read, especially as misinformation is so prevalent in today's age; regardless of your position on the political spectrum, it's an important discussion to have. Anti-vaccination rhetoric is something that is still puzzling, and while I'm not surprised that people who are a part of the upper class are peddling this type of discussion, the trickle down to middle and lower socioeconomic status individuals makes me worried about different types of misinformation, such as in politics, working rights and even just general understandings of the world around you. I wonder if you think that the people who view this type of content online are victims of this type of misinformation as they have nowhere else to turn for advice, or better yet, just want to belong to a social group so badly that they will rhetorically regurgitate everything that group says to simply feel welcomed in a community? I wrote a paper on how social media facilitates a lack of offline action in regards to political movements, and misinformation was one of my big key arguments surrounding the lack of action. I would be interested in hearing your thoughts about it. https://networkconference.netstudies.org/2022/onsc/381/social-change-in-online-networks-how-social-media-facilitates-a-lack-of-action-within-real-world-political-movements/
Regardless, great paper and a topic that definitely needs to be discussed more.
Best Regards,
Jack
Hi Jack,
I agree – tackling misinformation is so important. There definitely needs to be more proactivity in this space. I think anti-vaccination rhetoric is as prevalent as it is because it comes from places of influence. I doubt that middle and lower socioeconomic groups would have much of a voice in this space unfortunately. However, I agree that anti-vaccination rhetoric does impact these groups. I think everyone has access to the same information but it's more about knowing that the information is there and where to find it. It's interesting that you mention the want to belong. I was considering mentioning this in my paper but I couldn't fit it in and the research I found was more along the lines of psychology. I do agree that wanting to belong to a group would influence people joining the anti-vaccination community. As humans, we have a clear need to belong. Do you think the need to belong to a community facilitates misinformation spread online?
Cheers
Chantal
Hi there Chantal,
Thank you for the reply.
I think it’s really interesting around the concept of “do your own research” or “everyone has access to the same information”. Throughout multiple reports from ex-engineers at Google, investigative journalists at Forbes, The New York Times and multiple other credible outlets, the ways in which algorithms, especially around your location, affects the search results and “information” you get is insane. Meaning if your an anti-vaxxer, you are simply going to get more information to reaffirm this thought process. Even around topics like Climate Change, it’s a similar result where if you live in an area where climate change’s legitimacy is debated heavily, your Google results will be effected. It’s why the saying “do your own research” is kinda irrelevant now as “your research” could essentially just reaffirm everything you already believe. Additionally, belonging to a group is a concept that has been around for a long time. It’s not mutually exclusive thing for online communities. But I think it’s more around the opportunities arising for people to belong due to the accessibility of online social media platforms that creates this level of misinformation. I don’t think communities necessarily facilitate and foster misinformation, I think it’s just around it has a bigger opportunity to be created. I would be interested to see the research behind this (as I just don’t have enough information currently to form a proper opinion on the matter) and see what is currently suggested by experts in the communications and politics field. Great question though and the response was very thought provoking.
Hi Jack,
Yes I suppose you’re right! “Everyone has access to the same information” might be a bit of a naïve statement. When I said that I wasn’t even thinking about algorithms and echo chambers. Perhaps it should be investigated why people get different information. It almost enables opportunities for healthy and unhealthy debates. I think some communities do facilitate and foster misinformation, online and offline – just look at anti-vaccinators. I agree that it would be interesting to see further research on these topics
Cheers,
Chantal
Hi Chantal,
Your essay was really interesting, like Amber I too was surprised to read that the wealthier and well educated were more likely to spread misinformation. Do you think censorship of COVID-19 vaccine misinformation was introduced swiftly enough? Do you think the damage was already done?
My article was about knowledge sharing about medical conditions on social media, and one of the concerns I’d raised was the risk of medical misinformation. I’d like to hear your thoughts on it! https://networkconference.netstudies.org/2022/csm/843/a-challenge-of-knowledge-power-and-gender-equality-endometriosis-online-support-communities/#comment-1601
Hi Taylah,
Yes I was surprised and concerned to find that out as well. I always assumed that misinformation would come from lower socioeconomic groups and people with less access to education. Overall, it’s difficult to determine when censorship should be used because it could impact freedom of speech and access to different points of view. However, I do believe that COVID-19 and other health information should only be presented by credible sources. I don’t think censorship of COVID-19 misinformation was introduced swiftly enough and I actually think social media companies could do more. Yes, some posts are deleted but others remain and simply have an alert along the lines of “this post contains information about COVID-19”. I think as soon as information is online it’s already too late. There needs to be more moderation of COVID-19 related posts before they are published on the internet, but, again, I worry about freedom of speech.
Cheers,
Chantal
Hi Chantal,
Wow! what a great paper. It was so informative and well written and is still such a relevant topic (especially in WA). I thought it was really great you pointed out the difference between misinformation and disinformation because these two things can have drastically different meanings and I think it’s very important for people to know. For people who want to avoid fake news and the wrong information being spread (specifically about COVID-19) what are some trusted resources and places people should turn to get the correct info? and how do you think we stop the spread of mis/disinformation at its source? is it education about the spread of fake news? would love to hear your thoughts.
Amazing work 🙂
– J
Hi Jessica,
Thank you! I actually thought misinformation and disinformation were the same thing initially. It was interesting to see the different sources and impacts of misinformation and disinformation. If I had the answer on how to avoid misinformation and fake news I think I would be rich haha! Personally, I don't search for information but I still have misinformation presented to me on social media platforms. It's really annoying. Almost every day I report a post or hide ads. For me, the most trusted resources and places people should turn to for correct information would be organisations like the WHO and government agencies, along with recognised private health practices. Then again, there would be quite a few sources, so how do you narrow them down to be the go-to sources for accurate information? I think there should be moderating offices implemented by social media agencies in each country. There probably are already, but maybe offices with a more narrow focus. Like if a post mentions COVID-19, it's then flagged by the office, who can then be like "ok, this information is false, not in line with government health guidelines, and could negatively influence people in a way that could place them and people surrounding them in harm's way", so then the post isn't published and doesn't reach an audience. However, even with measures like this in place, people will still find a way around moderators and still post misinformation. E.g. people posting "s*it" or "cant" to get around social media pages that don't allow swearing. And it would be so difficult to keep track of every post. Lots of things to think about and be implemented by people that are getting paid way more than us… haha.
Cheers,
Chantal
Hi Chantal,
This topic is one that needs to be discussed as much as possible due to its relevance to all communities over the globe. It's quite alarming how social media has made it, as you mentioned, so easy to spread misinformation about COVID-19 when it's such a panic-provoking topic in the first place. I didn't realise how many Facebook ads promoted anti-vaccination themes! Based on your article, I think social media sites could regulate the spread of misinformation more effectively since their algorithms promote the appearance of similar content. How do you believe social media sites could better regulate such misleading content?
-Zoe
Hi Zoe,
I think you’re right. It’s as if social media platforms chuck a grenade (algorithms, enabling echo chambers) and walk away from the explosion (misinformation, arguments, panic). I think there needs to be more education to begin with. Hopefully with more awareness about misinformation and other relevant topics, people will be less inclined to spread it and seek it out. Additionally, people would be more aware of what is true or not and what sources to trust. Secondly, social media platforms need to step up moderation. It does look like Facebook is implementing measures. Interestingly, it appears some misinformation is financially motivated. Facebook has also stated that they aren’t the arbiters of truth given their scale, and is putting responsibility of reporting back onto the people. https://www.facebook.com/formedia/blog/working-to-stop-misinformation-and-false-news. Considering this, maybe grassroots movements would be more feasible in combatting misinformation on social media rather than the social media platforms themselves. Do you agree?
Cheers,
Chantal