
Twitter Infodemic is Still Empowering Refracted Publics in 2023

Abstract

This paper discusses the online communities that formed because of the COVID-19 infodemic. It outlines some causes of the infodemic and how refracted publics formed on Facebook, YouTube and Twitter during the pandemic. It discusses how hashtags were used to direct users to pages containing misinformation about COVID-19, and how Facebook combated these issues to minimise the spread of misinformation by refracted publics. Finally, it argues that Twitter continued to offer these affordances to refracted publics in 2023 and that these communities are still spreading misinformation on the platform.


Keywords: Refracted publics, Twitter, infodemic, hashtag jacking

 

Conference paper

Since the end of 2019, the world has endured the COVID-19 virus, and during this time people searched for answers. The World Health Organisation coined the term “infodemic” to describe the resulting flood of information, which drove the creation of new communities and the spread of misinformation (Omar & Casero-Ripollés, 2023). New online communities formed on social networking sites such as Facebook, YouTube and Twitter as a direct result of this infodemic. One example is the World Doctors Alliance, which disputed COVID-19 recommendations and spread misinformation that contributed to the infodemic. There were also confusing and contradictory messages about how to respond to the crisis, and anxiety levels began to rise among online community members (Omar & Casero-Ripollés, 2023). These emotions were problematic because users on social media conveyed news and information about COVID-19, and many users turned to these platforms for guidance (Omar & Casero-Ripollés, 2023). The result was that refracted publics were formed and consolidated. Refracted publics are complex to define and explain; however, in the context of the infodemic, they can result from community strategies that incorporate unsubstantiated claims, conspiracy theories and highly biased viewpoints through hashtag jacking (Abidin, 2021). Consequently, the COVID-19 infodemic has caused the ongoing formation and consolidation of refracted publics operating through Facebook, Twitter and YouTube; however, to this day, Twitter is the platform where these infodemic communities are most prevalent. This conference paper will argue that the affordances of mobile technologies allowed online communities on Facebook and Twitter to negatively influence users with misinformation and unsubstantiated claims about COVID-19. It will then examine how the infodemic unfolded on Facebook, Twitter and YouTube and how misinformation on these platforms negatively influenced users. Finally, it will discuss the concept of refracted publics and argue that these communities still operate on Twitter today through tactics such as discoverability and hashtag jacking.

 

In the 21st century, mobile technologies make social media easily accessible for users and groups and can enhance a sense of belonging. Although this accessibility is broad, it also applies to refracted publics and remains a problem today. Users are no longer bound by location, time or space: a user could post harmful information from their mobile phone in any location and receive responses from the other side of the world. Boczkowski et al. (2018) found that social media users routinely checked their accounts from phones and other mobile devices. Delanty (2018) similarly found that sites such as Twitter and Facebook have attempted to eliminate the boundaries of communication and improve its efficiency, and that these advances can consolidate communities and help to connect people. These findings are problematic in this context because refracted publics can be consolidated through the same channels by circulating harmful text and media posts. They show that mobile technologies improve communication, remove the obstacles of distance, space and time, and strengthen user and group connections for refracted publics.

 

As well as the convenience and accessibility of mobile technologies, the COVID-19 infodemic made news easy for refracted publics to access and generated content that ran counter to recommended medical advice and government guidelines. The World Health Organisation warned social media users about misinformation that contradicted and misrepresented COVID-19 recommendations (Omar & Casero-Ripollés, 2023; Demuyakor et al., 2021). This information contributed to confusion, misunderstanding and heightened emotions among social media users (Cinelli et al., 2020). Cinelli et al. (2020) found that over a 45-day period during the pandemic, social media generated around eight million posts. Within this excessive volume, much of the content included media protesting the COVID-19 pandemic and the authorities involved (Theocharis et al., 2021). Omar and Casero-Ripollés (2023) found that 75% of their sample across three countries followed news intently on Facebook to stay informed about what was happening during the pandemic, while Boczkowski et al. (2018) found that social media interactions and news consumption were often brief and incidental. Consequently, the accessibility of this misinformation left refracted publics dealing with unnecessary confusion and information overload.

 

As the infodemic progressed, refracted publics continued to grow on social media, and conspiracy theories bombarded these sites alongside the misinformation. Conspiracy theories are systems of belief that attribute an event, or the fate of a group or individual, to a hidden higher power (Theocharis et al., 2021). For example, one theory held that the numerous COVID-19 deaths were part of a government plan to decrease the population and repair the economy. These conspiracies could relate to religious propaganda, government takeovers or human genocide. Because of these beliefs, fuelled by fake news on social media, groups began to mistrust sound government advice regarding COVID-19 (Demuyakor et al., 2021). Theocharis et al. (2021) found that Twitter had a stronger association with conspiracy theory beliefs and their distribution than other social media platforms. Furthermore, many studies found that Facebook, Twitter and YouTube supplied groups with misinformation that contradicted sound vaccination advice and scientific research (Demuyakor et al., 2021). This misinformation about COVID-19 conspiracy theories negatively affected social media platforms (Yang et al., 2021). Some of these theories claimed that COVID-19 did not exist or that it was designed as a weapon to kill people (Lynas, 2020). Consequently, this misinformation and the user groups built around conspiracy theories contributed to the growth of refracted publics on social media.

 

Platforms such as Facebook and Twitter allowed refracted publics to thrive because they accommodated the formation and consolidation of diverse and complex online communities. For example, users were free to post and message, publicly and privately, the biased and misinformed perspectives that contributed to the infodemic (Visentin, 2021; Twitter, Inc., 2022). Community is about transparency, communication and acceptance, and Facebook and Twitter provided a space for virtual communities to be just as effective as real ones (Delanty, 2018). Facebook and Twitter allowed users to connect and interact personally (Hampton & Wellman, 2018), and these communities were diverse and open to many types of people who shared like-minded interests and passions (Delanty, 2018). Strengthening these communities was contingent on the passions and interests of their various users. Consequently, Facebook and Twitter facilitated refracted publics by accommodating the complexity of community formation: lifestyle accommodation, personal interaction, like-mindedness, and personal and social connections.

 

During the COVID-19 pandemic, refracted publics were formed and consolidated out of the infodemic’s wave of information through social connection and the sharing of various forms of media. These specific communities have been called “pernicious communities” because users were only interested in validating their views and ideals and connecting with users who agreed with them (Parsell, 2008, as cited in Delanty, 2018). Such communities were prevalent during the COVID-19 pandemic, and their engagement allowed for persistent contact and pervasive awareness. An example of persistent contact on Twitter was like-minded users exchanging tweets, images and media about the pandemic through constant retweeting (Hampton, 2016; Hampton & Wellman, 2018). Integrated with pervasive awareness, this persistent contact meant that a person did not have to be present while receiving images, videos and tweets (Hampton, 2016; Hampton & Wellman, 2018). Theocharis et al. (2021) note that this one-sided communication and content sharing can be detrimental in situations like the COVID-19 pandemic, as users do not always understand what they are receiving or how to engage with it. Such images could include photos of users protesting at COVID-19 rallies, graphic images of alleged vaccine reactions, or sarcastic memes hinting at government policies. Consequently, persistent contact and pervasive awareness formed and consolidated refracted publics during the COVID-19 outbreak.

 

YouTube and Facebook have addressed the infodemic dilemma; however, Twitter remains a harmful environment responsible for consolidating and forming refracted publics. In 2023, YouTube and Facebook no longer host as much negative COVID-19 information, whereas Twitter has yet to combat COVID-19 misinformation and continues to facilitate refracted publics. In 2021, Facebook banned COVID-19 conspiracy theories and related infodemic content from its platform (Visentin, 2021). Before the ban, Facebook accommodated pseudoscience influencers like the “World Doctors Alliance”, which attracted 5.7 million interactions between 2020 and 2021 (Gallagher, 2021). Today, in 2023, a user who searches for this organisation and its prominent influencers will find that the group has fewer than 200 members (World Doctors Alliance, n.d.). YouTube hosts videos that refute conspiracy theories and beliefs related to the pandemic; if a user searches for anything related to these conspiracies, educational videos countering them are provided. Twitter, however, contains videos that perpetuate misinformation. Hashtags such as #scamdemic and #plandemic take users to numerous Twitter accounts containing graphic images of alleged vaccine reactions, graphs showing unsubstantiated figures for COVID-19 deaths, and videos of people sharing views from unqualified perspectives. These biased and unchallenged Twitter accounts are another example of detrimental misinformation on social media (Yang et al., 2021). The videos include warnings about government brainwashing, claims of life-threatening vaccine side effects, and extreme views about government takeovers and the control of the population. Therefore, the affordances of Twitter consolidate the formation and continuation of refracted publics.

 

Even after the influx of the infodemic, refracted publics are still evident on Twitter in 2023. Information of this kind, readily available on Twitter, makes it faster and easier for communities to consume content and can result in the continuation of the infodemic (Abidin, 2021). Although infodemic content is not as prevalent as it was when the pandemic started, it still contains information detrimental to communities. Abidin (2021) refers to this bombardment as perpetual content saturation. For example, @scamdemic097, a Twitter user posting under the name “I did try to warn you”, has posted over 39,000 tweets (I did try to warn you, n.d.). These include retweets about vaccination failures and predicted future pandemic outbreaks. Also concerning is that this community influencer has retweeted material about politics, the financial sector, conspiracy theories, Bill Gates’s supposed death panels and many other unsubstantiated claims. It is alarming that this user has over 2,500 followers and follows over 3,000 other users. Consequently, these false COVID-19 claims are only one way that refracted publics persist on Twitter today.

 

In the context of refracted publics, hashtags and searches lead community users to unintended and confrontational places. The infodemic is still prevalent on Twitter, and specific communities use usernames, @ symbols and hashtags to convey their messages. Abidin (2021) refers to this unintentional message discovery as discoverability. For example, the user “Trust God!” retweets videos that turn viewers against medical advice on mask wearing. The page is deceptive because users who search for “Trust God!” expect something religious and encouraging. When users type #TrustGod, they can also find users like Santino Rice. Abidin (2021) refers to this tactic as “hashtag jacking”, as these hashtags steer users towards posts that deviate from their original intention. Santino Rice uses #TrustGod alongside #PfizerLiedPeopleDied and #scamdemic; he has over 71,000 followers, has posted over 12,000 tweets, is a pandemic conspiracy theorist and opposes vaccinations (Rice, n.d.). Another example is the hashtag #truth, which directs users to more infodemic content in the form of short videos and posts. One such account belongs to Judy A. Mikovits, PhD, a biomedical research scientist and conspiracy theorist with more than 159,000 followers (Mikovits, n.d.). Consequently, these search methods in combination lead users to unexpected places that consolidate the infodemic and refracted publics.

 

Next, refracted publics question the credibility of government policies on COVID-19. This mistrust in reputable government and medical authorities is called information distrust (Jack, 2017, as cited in Abidin, 2021). Lucky Number 7, or @Luckynu777, retweets posts that deny the existence of COVID-19 and label it a fake disease, and has also retweeted predictions of another future pandemic (Lucky Number 7, n.d.). Abidin (2021) warns against using unverifiable, speculative and confusing sources. Lucky Number 7 has over 1,500 followers and has posted over 16,500 tweets. One of their retweets is a post from Patrick Henningsen, or @21WIRE, who has over 106,000 followers and describes himself as a “geopolitical analyst, indy journo and host”. He has posted several confronting images, videos and tweets questioning the validity of COVID-19, and he blames infertility rates on the COVID-19 vaccination (Henningsen, n.d.). Mr Henningsen has no authority to comment on this topic, as he is not a medical or vaccination expert. These contradictions of credible sources are detrimental and unhelpful to users (Yang et al., 2021), and they are only a few examples of the many refracted publics operating through Twitter.

 

Refracted publics remain operational on Twitter because information related to the infodemic is still readily attainable for community members. According to Abidin (2021), this information is aimed at communities that add to the infodemic and may hold controversial views about the pandemic and the state of the world. Users who type “#VaccineSideEffects” will find claims that COVID-19 injections cause severe illnesses, future complications and deaths. Posts linked to this hashtag malign the injections and the pharmaceutical companies behind them, using small sample sizes to claim credibility and validity. Wide Awake Media, or @wideawake_media, has over 13,000 followers and openly states that the COVID-19 injection is killing people. The account posts controversial videos of people suddenly dying and of Bill Gates discussing how the population needs to decrease, and claims that the rest of the world is out of touch with reality (Wide Awake Media, n.d.). Users who type “#covid19” are shown news reports of people dying, with the deaths blamed directly on the vaccine. Another user surfaced by this search is William Makis, MD, or @MakisMD. Dr Makis’s account catalogues random and unexplained deaths that are, again, blamed on the COVID-19 vaccine, and asserts that the injections are causing increased cancer rates (Makis, n.d.). Alarmingly, Dr Makis has over 35,000 followers. These bogus claims reflect extreme views of COVID-19 and sustain refracted publics today (Abidin, 2021). Consequently, these communities remain evident in the Twitter environment because of the affordances of these search methods and the accessibility of controversial content.

 

The COVID-19 infodemic influenced the ongoing formation and consolidation of refracted publics on Facebook, Twitter and YouTube, and infodemic communities are still prevalent on Twitter today. The affordances of mobile technologies made group formation, connection and consolidation accessible. The constant bombardment of unnecessary information, or perpetual content saturation, caused stress and confusion among social media communities, and the distribution of COVID-19 conspiracy content negatively influenced Facebook and Twitter groups. Facebook and Twitter also accommodated the complex process of community formation, while persistent contact and pervasive awareness facilitated that formation during the pandemic. Platforms like YouTube and Facebook attempted to eliminate misinformation, while Twitter did not. Refracted publics remain evident on Twitter: the platform allows them to use search methods such as usernames and @ symbols (discoverability) and hashtags (hashtag jacking) to lead users, intentionally, to their questionable communities. Information distrust is also evident among numerous users, and COVID-19 information has blended with conspiracy-based posts related to the infodemic. These findings confirm that refracted publics are still operational, discoverable and influential on Twitter in 2023.

 

 

References

 

Abidin, C. (2021). From “networked publics” to “refracted publics”: A companion framework for researching “below the radar” studies. Social Media + Society, 7(1), 205630512098445. https://doi.org/10.1177/2056305120984458

 

Boczkowski, P. J., Mitchelstein, E., & Matassi, M. (2018). “News comes across when I’m in a moment of leisure”: Understanding the practices of incidental news consumption on social media. New Media & Society, 20(10), 3523-3539. https://doi.org/10.1177/1461444817750396

 

Cinelli, M., Quattrociocchi, W., Galeazzi, A., Valensise, C. M., Brugnoli, E., Schmidt, A. L., Zola, P., Zollo, F., & Scala, A. (2020). The COVID-19 social media infodemic. Scientific Reports, 10(1), 16598. https://doi.org/10.1038/s41598-020-73510-5

 

Delanty, G. (2018). Virtual community: Belonging as communication. In Community (3rd ed., pp. 200-224). Routledge.

 

Demuyakor, J., Nyatuame, I. N., & Obiri, S. (2021). Unmasking COVID-19 vaccine “infodemic” in the social media. Online Journal of Communication and Media Technologies, 11(4), e202119. https://doi.org/10.30935/ojcmt/11200

 

Gallagher, F. (2021, December 3). Facebook ‘failing’ to tackle COVID-19 misinformation posted by prominent anti-vaccine group, study claims. ABC News Network. https://abcnews.go.com/Technology/facebook-failing-tackle-covid-19-misinformation-posted-prominent/story?id=81451479

 

Hampton, K. N. (2016). Persistent and pervasive community: New communication technologies and the future of community. American Behavioral Scientist, 60(1), 101-124. https://doi.org/10.1177/0002764215601714

 

Hampton, K. N., & Wellman, B. (2018). Lost and saved . . . again: The moral panic about the loss of community takes hold of social media. Contemporary Sociology: A Journal of Reviews, 47(6), 643-651. https://doi.org/10.1177/0094306118805415

 

I did try to warn you [@scamdemic097]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/scamdemic097

 

Judy A. Mikovits PhD [@DrJudyAMikovits]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/DrJudyAMikovits

 

Lucky Number 7 [@Luckynu777]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/Luckynu777

 

Lynas, M. (2020, October 12). COVID: Top 10 current conspiracy theories. Alliance for Science. https://allianceforscience.org/blog/2020/04/covid-top-10-current-conspiracy-theories/

 

Omar, R. G., & Casero-Ripollés, A. (2023). Impact of influencers’ Facebook pages in cultivating fear and terror among youths during the COVID-19 pandemic. Online Journal of Communication and Media Technologies, 13(2), e202314. https://doi.org/10.30935/ojcmt/13005

 

Patrick Henningsen [@21WIRE]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/21WIRE

 

Santino Rice [@SANTINORICE]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/SANTINORICE

 

Theocharis, Y., Cardenal, A., Jin, S., Aalberg, T., Hopmann, D. N., Strömbäck, J., Castro, L., Esser, F., Van Aelst, P., De Vreese, C., Corbu, N., Koc-Michalska, K., Matthes, J., Schemer, C., Sheafer, T., Splendore, S., Stanyer, J., Stępińska, A., & Štětka, V. (2021). Does the platform matter? Social media and COVID-19 conspiracy theory beliefs in 17 countries. New Media & Society, 1-26. https://doi.org/10.1177/14614448211045666

 

Twitter, Inc. (2022, July 28). COVID-19. Twitter Transparency Center. https://transparency.twitter.com/en/reports/covid19.html#2021-jul-dec

 

Visentin, L. (2021, February 9). Facebook bans vaccine conspiracies in COVID-19 misinformation crackdown. The Sydney Morning Herald. https://www.smh.com.au/politics/federal/facebook-bans-vaccine-conspiracies-in-covid-19-misinformation-crackdown-20210209-p570vh.html

 

Wide Awake Media [@wideawake_media]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/wideawake_media

 

William Makis MD [@MakisMD]. (n.d.). Tweets [Twitter profile]. Twitter. Retrieved April 7, 2023, from https://twitter.com/MakisMD

 

World Doctors Alliance. (n.d.). Home [Facebook page]. Facebook. Retrieved April 7, 2023, from https://m.facebook.com/groups/417897127069226/?ref=share&mibextid=l066kq

 

Yang, K., Pierri, F., Hui, P., Axelrod, D., Torres-Lugo, C., Bryden, J., & Menczer, F. (2021). The COVID-19 infodemic: Twitter versus Facebook. Big Data & Society, 8(1), 205395172110138. https://doi.org/10.1177/20539517211013861

 

 



Comments

6 responses to “Twitter Infodemic is Still Empowering Refracted Publics in 2023”

  1. Charlotte Phillips

    Hi Caesar,

    Thanks for a great contribution to the COVID-19/infodemic discourse on Twitter!

    I found hashtag jacking such a fascinating subject; I hadn’t heard of it before reading Abidin’s paper – I discuss the method in my work as well. Do you think there is any way we can police this one? Or should we, at all? Free speech is quite a big conversation to be had as well!

    Twitter is definitely a top choice for refracted publics to permeate and further influence individuals online. The number of medical professionals openly publishing content with an anti-vax agenda is quite worrying! What do you think Twitter could do to improve their response to misinformation and “fake news”?

    My paper is quite similar to yours but goes more into depth regarding the anti-vax community specifically – would love to hear your thoughts!

    Charlotte.

    1. caesar.al-samarrie

      Hi Charlotte,

      Thanks for your encouraging words. I don’t think that we can police hashtag jacking or fake news, but we can educate people to think critically about what they read and how to evaluate the deceptive tactics of refracted publics. Ultimately, we will always be bombarded with information, so we need to be educated enough to sift through people’s hidden agendas and misconceptions. I am looking forward to reading your paper.

      Thanks,

      Caesar

  2. Stephen.B.Bain

    Hi Caesar,

    Your paper certainly gave me another reason to look down on Twitter; however, yours is more academically presented than my own personal ‘there’s something about it that I don’t like’!

    Do you think that, in future, Facebook and/or YouTube will do more to spread their ability to identify and manage ‘refracted publics’ across to other topics? Is over-regulation by platforms a risk?

    Cheers
    Steve

  3. caesar.al-samarrie

    Hi Steve,

    The key is not to rely on these platforms to police other topics but to be proactive in presenting online users with facts and critical thinking skills. Facebook and YouTube have done an excellent job of stopping these COVID-19 misconceptions. However, I think they will only act if an issue becomes a significant societal problem and affects numerous users. They have no obligation to police our views on subjects; their main focus is to provide us with a platform that we will engage with and expand for them.

    Thanks,

    Caesar

  4. Peter

    Hi Caesar,
    Like Charlotte, the idea of hashtag jacking is new to me. So, thank you for that.
    I am just wondering, how does the common argument to allow free speech distort the self-perceived role of the social media companies in allowing alternative facts and conspiracies to be promulgated?
    As far as I know, these companies exist to allow a platform for self expression and a sharing of ideas.
    When social media companies regulate posts to conform to their “standards” aren’t they promoting censorship?
    If punters choose to overlook science, medicine, and edited professional media and choose instead to believe the fanciful, contrarian comments of Joe Blow from Indiana, Timbuktu or 1600 Pennsylvania Ave… are we to blame the SM companies for this lack of regard for self preservation by potential Darwin award recipients? Is there an argument that Facebook and YouTube are doing more harm by censoring content?

    Cheers,

    Peter

  5. caesar.al-samarrie

    Hi Peter,

    You raise some thought-provoking points. However, when the World Health Organisation deems the misinformation spread by these communities a significant cause of the infodemic, I think those communities have taken their freedom and influence too far. Also, the deceptive scare tactics that refracted publics use to push unwanted agendas (hashtag jacking) encroach on someone else’s freedom: a person did not search for this information directly, but these community users misled them towards it. The issue is that these communities are poisoning impressionable minds and misusing social media. I know this will not change soon, but if we educate people, they can make their own choices and not be deceived by these toxic communities.

    Thanks again for your feedback,

    Caesar
