Social media algorithms and popular trends have influenced the perception of COVID-19 more than the truth. We are in an era where everything, from news and entertainment to education and social interaction, is at our fingertips. It is not just the instant access to an endless pool of information, but also the power to access it whenever we want. Seeking the truth about issues such as COVID-19 will increasingly require effort from users as they navigate a web full of misinformation and as algorithms fill their feeds with what they like and want to hear, rather than what is verified to be true (Cinelli et al., 2020). On the flip side, health campaigners need to keep up with social media, putting the right information on the platforms where people spend most of their time and utilising the power of algorithms (Lingyao et al., 2021).

 

It should be no surprise to see the world unfold in chaos when it faces something as uncertain and life-altering as the coronavirus. When the virus broke out, social media was the first place people turned to for answers, which highlights the power and influence social media has in our lives (Chen et al., 2021). When something affects people personally, there is a greater level of emotion driving the sharing of information (Wang et al., 2021). Social media, being largely a platform for self-expression, is filled with endless opinions and information, and people are more likely to share negative information than positive. Such tendencies have become a great barrier for public health campaigners, as what wins over public attention is not necessarily fact-based information but whatever is popular online (Wang et al., 2021). Think about how we judge whether a product or service is good: we often check its reviews, see how many people have interacted with it, or look at how large its social media following is. Similarly, society has evolved to “fact-check” the COVID-19 information we come across by trusting our close circle of friends’ words, judging whether we believe the website or company to be trustworthy, or even seeing how many shares and likes a post has received. This phenomenon of misinformation spreading online has been coined by the World Health Organization as the “infodemic” (WHO, n.d.).

 

Lingyao et al. (2021) suggest to public health campaigners and policymakers that social media is something not to despise but to make full use of. Their article shows that social media is where attitudes and perceptions of health policies and mandates can be observed easily and used to shape better future decisions. However, care must be taken when carrying out observations to navigate one-sided and algorithm-boosted opinions, so that it is not just the loudest voices that are listened to.

 

Social media should also be used to disseminate truth by running active accounts on common platforms such as Facebook, Instagram, Twitter and even TikTok (for younger audience demographics) to increase reach. Considering that, more often than not, we use our devices as the sole source of our entertainment, social connections, education, help and news, it makes sense that accurate information should be placed on the right platforms in the first place. It also needs to be presented in ways that make it shareable – for example, professional but interesting reels that update people on important developments, with subtitles for an easy read-along. With the help of the algorithm, a reel can reach people who do not follow the page or its hashtags, especially those in the organisation’s local area. The National Health Service (NHS) in the United Kingdom exemplifies this strategy in its COVID-19 vaccine campaign. Stickers, filters and lenses with the message “I’ve had my COVID vaccine” were created and made available to users across Snapchat, TikTok and other social media platforms (MENA Report, 2021). A question-and-answer session was even held on the Prime Minister’s official Snapchat account, where users could put COVID-19 questions to government officials (MENA Report, 2021).

 

Public health campaigners should also make use of significant COVID-19 events (such as large outbreaks or supply-chain delays caused by employees catching COVID-19) to promote and encourage communities to act on preventative measures like wearing masks and getting vaccinated (Wang et al., 2021). Aligning health promotion with such events can prove more effective, as messages about preventative measures are communicated at the time they are most relevant and while emotions run high. During such critical times, it is easier for communities to recognise the importance of preventative measures and to take them on, because they are experiencing the events in real time. For example, people who experience a COVID-19 outbreak in their workplace or school would recognise the urgency of protecting themselves because they are personally impacted by it. The peaks of an outbreak naturally capture people’s interest on social media, making it an optimal moment for public health campaigners to maximise their reach by frequently pushing out relevant information and answering people’s questions and concerns (Cinelli et al., 2020).

 

Wang et al.’s (2021) American study found that the public’s perspectives on mask-wearing and vaccinations were affected by the news and its discussions of infection rates, public policies and health recommendations. When the U.S. government talked about reopening its economy and schools, two things resulted: an increased level of mask-wearing discussion and negative reactions (Wang et al., 2021). Even when COVID-19 death rates rose, people’s reactions did not follow suit but were instead shaped more by news coverage (Wang et al., 2021). When the words “impacted” and “influenced” are used here, they describe not just people’s perceptions or attitudes but also their actions. In 2020, CNN published an article anticipating a military-backed lockdown of Lombardy, a region in Italy (Donato et al., 2020). Although there was no official confirmation from the government at the time, the report caused overcrowding at airports and train stations as locals acted on the news and tried to leave the area before the anticipated lockdown took place. Instead of the government announcing the lockdown to prevent the spread of COVID-19, the panic caused by the news article did the opposite: masses of people crowded into confined spaces, creating perfect conditions for a coronavirus outbreak (Cinelli et al., 2020). This chaos resulted from the voice of a news channel that did not have all the facts on hand before publishing, and it could only have caused such waves of panicked movement because of the fear-driven sharing of the article on social media. The more traction the article received through likes, comments and shares, the more people it reached through algorithms (Cinelli et al., 2020).

 

Wang et al. (2021) found that Twitter users were more likely to respond to and share negative tweets concerning vaccinations, a topic largely surrounded by uncertainty. On the other hand, when it came to mask-wearing (a less uncertain topic), positive tweets received more engagement. Mask-wearing has even motivated people to get creative and share their ideas on DIY and personalised masks through social media, opening dialogue about making more environmentally friendly masks, cutting face mask ear straps and disposing of masks properly to protect marine life (Toliver, 2020), and making masks with clear panels for hearing-impaired people who rely on lip-reading to communicate with others (Wang et al., 2021). Amid COVID-19, people have been educated en masse on environmental issues and on the health considerations of those who face greater implications from vaccination and mask mandates, such as people with complex health histories who may have to be more cautious about the vaccine’s risks to their bodies and, as mentioned, hearing-impaired people. Such growth in common knowledge shows the power of being in a community on social media and hearing each other’s perspectives.

 

So, while there are drawbacks to social media and the oversaturation of information and opinions, there is also a beauty to it: it is the easiest way people can unite regardless of background or geographical location. We see snippets of that beauty especially during world events and disasters, such as the war in Ukraine. For example, entrepreneurs in Poland have deployed hundreds of 3D printers to print various pieces of war equipment. The transportable printers are incredibly useful during wartime because their small size means they can be set up anywhere, and equipment designs can be shared between printers. This approach solves the problem of having to supply physical parts to war zones. Ten thousand parts have since been supplied, including protective equipment, periscopes and even drones, 3D printed in various parts of Ukraine at a fraction of the cost. This comes as a result of a handful of Polish tech companies combining their efforts and thinking outside the box, all whilst utilising the Internet (Feldman, 2022).

 

In contrast, hyperawareness of spreading misinformation can also cause anxiety and a hesitancy to share information, even from trusted sources, for fear of being wrong and risking one’s reputation (Altay et al., 2020). People have shown declining trust in, and increased scepticism towards, sources that have previously shared inaccurate information, demonstrating the human tendency to become more cautious after something goes wrong (Altay et al., 2020). Such cautiousness has shaped online users into responsible consumers of digital content, as they highly value not only the accuracy of information but also their own reputations. The human desire to do good, to be seen as a good person and to be a trusted source of information motivates users to be responsible for what they consume and to avoid sharing fake news (Altay et al., 2020). One study found that only 0.1% of Twitter users were responsible for 80% of the misinformation shared (Altay et al., 2020). It also found that organisations sharing false information were more likely to be known for it, such as tabloids, which hold a similar level of trust to junk mail. In comparison, sources such as The New York Times were less likely to share fake news, as it holds a reputation for being a trustworthy news organisation (Altay et al., 2020). Between tabloid magazines and The New York Times, the tabloids have less to lose when it comes to sharing fake news, as they do not hold as high a reputation as The New York Times.

 

While it is easy to criticise algorithms and popular trends, one has to acknowledge that our world has become so heavily dependent on these aspects of our digital lives that it would be frustrating to live without them. Having social media feeds serve you randomised content that you do not relate to would be almost irritating, as you would no longer have instant access to all the content you could enjoy. We have become accustomed to getting everything at our fingertips, and therefore have little to no patience for anything that falls outside our preferences. There can be greater regulation and filtering of misinformation, but it can never entirely prevent its spread. As a result, COVID-19 misinformation is left to influence the attitudes and decisions of the people it reaches, amplified by algorithms based on how much engagement it receives (Cinelli et al., 2020). However, the hope is that as people become more educated in deciphering what is true from what is not, the spread of misinformation will have a slightly less reckless effect on the public. Public health campaigners and policymakers should also learn to maximise the audience reach of social media by employing people experienced in producing content fit for these platforms. This should be done in an engaging and refreshing way, not a dry and outdated one – walls of information on a website with a stock picture or two do not cut it in this day and age.

11 thoughts on “How Social Media Algorithms and Popular Trends Influence the Perception of COVID-19”

  1. Andrea Dodo-Balu says:

    Nice to see your paper here Kaylee. You mention in your conclusion that you hope people will become more educated in deciphering what is true from what is not on social media. What do you see as an effective way of achieving this?
    Andrea

    • Kaylee Liew says:

      I used to think that educating people on where to find trusted sources of information, such as government websites and health organisations, would be an effective way of achieving what I suppose is social media literacy, but in the case of Russia and the war in Ukraine, that has proven to be the opposite of effective. It will depend quite heavily on the context of the situation. In a non-war-threatened part of the world like Australia, social media literacy can look like educating people to be more sceptical of the information they come across. For example, reading about health treatments on a random blog rather than a health organisation’s website is generally not a wise idea, because the information is more likely to be untrue.

      In the case of Russia and Ukraine, we see the opposite effect, where social media activists are the ones showing the true reality of the war, while the Russian government displays a completely different image on its communication channels – for example, Vladimir Putin being broadcast at the midnight Orthodox Easter service in the midst of war.

      • Andrea Dodo-Balu says:

        Thanks for your reply Kaylee. You have a great point – critical thinking and social media literacy need to be applied to all sources on the internet, even government sources!

  2. Erica Lim says:

    Hi Kaylee,

    I enjoyed reading your paper. I liked how you discussed the pros and cons of algorithms and how, while they might cause panic during COVID-19, they also allow us to share our knowledge with ease. I agree that we are so reliant on algorithms that they have become an important affordance on social media platforms. They can curate content for each specific user, which is quite amazing! However, I wonder if privacy is now becoming an issue due to the vast amounts of information being collected in exchange for curated content? Regarding COVID-19 vaccination and masks, do you think that, because of algorithms, certain news is only pushed out to a specific group of people, which strengthens their opinion on something? For example, is news about the consequences of vaccinations only promoted to anti-vaxxers, which further accentuates their hate for vaccines?

    • Kaylee Liew says:

      Hi Erica,
      Thanks so much for reading my paper! That is a great observation you’ve pointed out. Come to think of it, it is actually very plausible for anti-vaxxers to continue receiving certain types of news suited to their liking because of the algorithm – which points to the big downside of algorithms pushing content we prefer. That would work both ways too, with pro-vaxxers only receiving information that supports their views. If people want a balanced perspective on anything, they will likely need to search for it themselves at the right sources.
      Great point!

      • Erica Lim says:

        Hi Kaylee,

        Thank you for your reply. That is very true. However, I think that because content like this is being pushed out to specific people, very few people actually search for credible sources which might contradict their views. Even if they did, they would have to read several articles that reject their views to be convinced that they are ill-informed.

        • Kaylee Liew says:

          Hi Erica, I definitely see your point. If one has already made their mind up about something and therefore approaches the situation or issue close-minded, no amount of articles could possibly change their mind.

          • Erica Lim says:

            Hi Kaylee,
            It is certainly very difficult to change the minds of people who are very close-minded. Thank you for a great discussion and a fascinating paper!

  3. Amber Dwyer says:

    Hi Kaylee,

    What an engaging read! I think online misinformation has been a prevalent concern for some time now, but the current pandemic has definitely exacerbated its spread and thus the damage it is causing. It’s so true that it’s always the bad news that makes the headlines over the good; even when looking at restaurant reviews, it seems that for every 10 happy customers who wouldn’t think to write a positive review, the one unhappy customer is straight onto their laptop. This often creates a very biased and negative environment online that isn’t representative of the entire truth.

    I completely agree that accurate information is often displayed in a bland way, with a lack of graphics and colours, instead being a dull and very difficult-to-digest paragraph of small, black body text. In order to engage more users, we do need to make this information much more attractive and palatable, and there are some great initiatives doing just this! What I do wonder, however, is what’s to stop spreaders of false information from employing this same tactic? I find that, more often than not, fake information is presented in a much more marketable way, presumably because it is compensating for the absence of facts and statistics. Although authentic information is often painful to read, at least its formal presentation is one way of differentiating it from less credible sources. I worry that if all information were made to look the same, it would just add more confusion to the mix and make it more difficult to distinguish the real from the fake.

    I also think that whilst health companies and initiatives have a huge role in projecting accurate information, making sure it is being consumed by the greater public and dispelling all the false information in circulation, is it up to these companies, or does the accountability fall more on the actual platforms like Instagram and Facebook? Should they not have the responsibility of sifting through the content uploaded, taking down what is false and further promoting what is true? I think in an environment as powerful as the digital world, this is probably an issue that demands solutions from a multitude of teams, but I’d love to hear your thoughts on where the accountability lies.

    This article also prompted me to consider that, as you stated, people are becoming more and more reliant on social media as a source for entertainment, news etc., and I think it’s great that they can go to one place at the tip of their fingers for everything they need. What I worry is that much of the information published online, especially in the context of the pandemic, whether it be true or false, is very anxiety-provoking and distressing for many. Is it necessarily such a positive thing that people receive their news from social media? Does this take away users’ right to access a superficial, fun outlet where they can consume frivolous images and videos of the things and people they enjoy? I feel like every time I go onto social media, even if it’s just to watch a friend’s stories or see the latest trends on TikTok, I am constantly being bombarded with bad news and horrifying events. I think that social media began as a very positive environment, but as you stated, with negative news receiving more engagement than positive, it feels like the digital space is becoming increasingly dire and detrimental to our mental health. I’d love to hear your take, and if you are passionate about this topic, I think you’ll find my paper quite interesting; it’s all about how social media has been weaponised by wellness warriors to spread misinformation within the wellness community, also touching on COVID! https://networkconference.netstudies.org/2022/csm/688/social-media-weaponised-in-the-wellness-community/

    • Kaylee Liew says:

      Hi Amber,
      What a detailed and in-depth response! You make a good point that it could potentially be a downfall to have factual information presented in the same marketable way as false or unfounded information. In a way, that could pose more problems, as users would have to try even harder to distinguish what is factually sound from what is not, which is the opposite of what you would want. As an organisation communicating to the masses, you would want your message to be as easily digestible as possible because, in a day and age where we would rather watch reels or TikToks that are usually 30 seconds or less, we really don’t have the attention span to read long blocks of text, or even a quarter of one.

      Great question on accountability and who it falls on most. I do think that the platforms themselves have an overarching responsibility to regulate content, as they already do with spam accounts, inappropriate content and, now more commonly, COVID misinformation. I think something as simple as a verified tick can visually tell users when a source of information is reliable, though it would be quite the process to roll out when you think about the sheer volume of content uploaded to Instagram, Facebook, Twitter and even TikTok on a daily basis. It would probably require the headquarters of these platforms to expand their teams to help verify things.

      I also agree with your last paragraph. At times, I have also become quite overwhelmed by all the chaos unfolding around the world, and as someone who had to keep up with the news for a previous journalism unit, it definitely drove me to tune the news out entirely because I just did not want to hear about everything that was wrong with the world. After a solid break, I now find I like to read some news articles here and there if the headline sparks my interest – even if it is sad. While I don’t think it is entirely possible to tune all negativity out (which, in a way, can be unhealthy if taken to the extreme), not following news outlets on your social media accounts may be a simple way to at least reduce the stressful news. Something I realised a while ago is that if something is significant enough, I’ll hear about it eventually, whether from a friend or family member or somehow online (this was when I was taking that journalism unit and feeling overwhelmed by how many things were going on at the same time).

      Thank you so much for linking your paper, I am reading it now and I am enjoying it!

  4. Radib Ahmed says:

    Hi Kaylee Liew,
    This paper was fantastic! It’s well written and covers a wide range of topics and concepts that I would not have associated with the COVID-19 pandemic. I was particularly interested in the subject of ubiquitous awareness, since I’ve been fascinated by the issue of para-social connections on social media platforms. I particularly enjoyed the discussion of algorithmic bias and how our news feeds can create a bubble in which we are provided information that just reinforces our confirmation bias.
    -Radib
