Navigating the Minefield: The Proliferation of Misinformation on Facebook
Abstract:
This paper examines the transformation of Facebook into a breeding ground for misinformation, analysing its impact on public discourse, democracy, and individual beliefs. It argues that a combination of personalised algorithms, weak content moderation, and the psychological underpinnings of social media interaction has set misinformation on a dangerous trajectory on the platform.
Introduction:
Facebook’s evolution as a social media giant is a story of remarkable growth, innovation, and controversy. Founded in 2004 by Mark Zuckerberg and his four college roommates Eduardo Saverin, Andrew McCollum, Dustin Moskovitz, and Chris Hughes at Harvard University, “The Facebook” initially served as a social network exclusively for Harvard students. Its popularity quickly surged, and it expanded to other Ivy League universities and eventually to colleges nationwide (Brügger, 2015).
Having evolved constantly over the years, Facebook now plays a pivotal role in modern communication and information dissemination, shaping how individuals, communities, and organisations interact and share information. As a digital platform that makes content sharing and communication easy, it has become integral to daily life, influencing everything from personal relationships to global events.
Despite its origins as a platform designed to connect friends and family, and its potential for positive community building and information sharing, Facebook has faced increasing scrutiny and become a dangerous vehicle for the spread of misinformation, affecting not only individual users but also public perceptions of complex issues around the world.
Section 1: The Mechanisms of Misinformation Spread on Facebook
Facebook’s algorithms, designed to maximize user engagement, play a crucial role in shaping the information landscape on the platform. By prioritizing content that is likely to generate interaction, such as likes, shares, and comments, these algorithms can inadvertently favour sensationalist, divisive, or even inaccurate information over more nuanced or accurate reporting. This prioritization of engagement over accuracy has significant implications, including the rise of sensationalism, the formation of echo chambers, and the exploitation of users’ psychological biases.
Engagement-Based Algorithms:
Facebook’s algorithms are sophisticated systems that determine what content appears in a user’s News Feed. Historically, these algorithms have been optimized to keep users on the platform as long as possible by showing them content that is most likely to capture their attention and provoke a reaction. While this approach maximizes engagement, it does not necessarily align with the goal of promoting accurate or reliable information.
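To make this concrete, the sketch below shows, in Python, what engagement-based ranking looks like in principle. The field names, weights, and scoring formula are illustrative assumptions, not Facebook’s actual News Feed algorithm, which is proprietary and far more complex; the point is simply that a score built from predicted likes, shares, and comments contains no term that rewards accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float     # model estimate of likes this post will earn
    predicted_shares: float    # shares spread content to new audiences
    predicted_comments: float  # comments count as engagement, even angry ones
    accuracy: float            # 0.0-1.0 fact-check score; never used below

def engagement_score(post: Post) -> float:
    """Score purely on predicted interaction; nothing rewards accuracy."""
    return (1.0 * post.predicted_likes
            + 3.0 * post.predicted_shares      # weights are illustrative only
            + 2.0 * post.predicted_comments)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order a user's candidate posts by engagement score, highest first."""
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Nuanced policy analysis", 40, 5, 10, accuracy=0.95),
    Post("OUTRAGEOUS claim you won't believe!", 90, 60, 80, accuracy=0.10),
])
print([p.text for p in feed])  # the sensational, inaccurate post ranks first
```

Because this kind of objective penalises only low engagement, never inaccuracy, the sensational falsehood outranks the sober report by construction.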
The Rise of Sensationalism:
Sensationalism has been embedded in media for a long time. Its definition has evolved over the years, but its purpose has always been to appeal to human emotion, thought to inspire “unwholesome emotional responses, stimulate empathy and evoke curiosity” (Brown et al., 2018). With the digital space being as vast as it is, access to sensationalist content is easier than ever. Content that evokes strong emotional reactions, be it outrage, fear, or amusement, is more likely to be shared and commented on, so sensationalist or misleading content can receive outsized visibility on Facebook regardless of its factual accuracy. This dynamic encourages content creators to produce more of such content, further exacerbating the problem.
The Formation of Echo Chambers (Groups):
“An echo chamber can act as a mechanism to reinforce an existing opinion within a group and, as a result, move the entire group toward more extreme positions” (Cinelli et al., 2021). Facebook’s algorithms contribute to this effect in two ways:
- Personalization: The platform personalizes each user’s News Feed based on their past interactions, showing them content similar to what they have liked or engaged with before. This personalization means users are more likely to see content that reinforces their preconceptions; a toy simulation of this feedback loop follows this list.
- Group and Page Recommendations: Many Facebook groups are either private or have selective membership criteria, creating a homogeneous user base. Similarly, pages often cater to specific interests or viewpoints. This selectivity ensures that members or followers are likely to share similar perspectives, reducing the diversity of opinions and information presented.
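The sketch below is a toy model of the personalization feedback loop, not Facebook’s actual recommender: the topic labels, the bias-strength constant, and the assumption that users mostly click items matching their dominant interest are all hypothetical. It shows how repeatedly personalizing on interaction history alone can collapse a feed toward a single viewpoint.

```python
import random

TOPICS = ["politics-left", "politics-right", "sports", "science", "celebrity"]

def recommend(history: list[str], catalog: list[str], k: int = 5) -> list[str]:
    """Toy personalisation: weight each topic by past engagement with it.
    The bias strength of 5 is an arbitrary assumed constant."""
    weights = [1 + 5 * history.count(topic) for topic in catalog]
    return random.choices(catalog, weights=weights, k=k)

random.seed(1)
history = ["politics-left"]  # a single initial interaction
for _ in range(10):
    shown = recommend(history, TOPICS)
    # Assumed user behaviour: click only items matching the dominant interest,
    # falling back to the first shown item if nothing matches.
    dominant = max(set(history), key=history.count)
    clicked = [topic for topic in shown if topic == dominant]
    history.extend(clicked or shown[:1])

# The share of the feed devoted to one topic climbs toward 1.0 over rounds.
print(history.count("politics-left") / len(history))
```

The design point is the loop itself: each round of engagement skews the weights, which skews what is shown, which skews the next round of engagement.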
The Impact of Echo Chambers (Groups):
Echo chambers can have several harmful effects on society:
- Division: The rampant spread of misinformation can incite societal divides, entrench users in their beliefs, and make them more resistant to alternative views.
- Real-world Actions: Misinformation circulated in echo chambers can lead to real-world consequences, including public health risks (e.g., vaccine hesitancy), political unrest, or even violence.
- Distrust in Institutions: Persistent exposure to misinformation can erode trust in established institutions, including the media, the scientific community, and government, as these entities’ narratives often conflict with those prevalent within echo chambers.
Cognitive Biases (Psychological Aspects):
The spread and acceptance of misinformation on platforms like Facebook are aided by psychological factors such as cognitive biases. These biases are systematic patterns of deviation from norms or rationality in judgment, whereby individuals construct their own “subjective reality” from their perception of the input.
The following examples discuss some common cognitive biases we see in the spread of misinformation (Mitchell, 2021):
- Confirmation bias – only finding or seeing data that confirms one’s preconceived notions. On social media, users are more likely to engage with and share information that aligns with their views, regardless of its accuracy. This bias is particularly potent in echo chambers, where users are exposed to a similar set of opinions and facts that reinforce their existing beliefs.
- Semmelweis effect – the tendency to reject evidence that contradicts a current paradigm. This explains why certain individuals or groups may resist factual information that contradicts their preexisting beliefs or misinformation they have accepted as truth.
- Bandwagon effect – the tendency to do or think things because other people are doing or thinking the same. Misinformation can spread rapidly on social media as users see others sharing a piece of information and decide to share it themselves, assuming that the popularity of the content is an indicator of its truthfulness; a toy simulation of this dynamic follows this list.
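The simulation below illustrates the bandwagon dynamic under assumed parameters: the functional form of the share probability and its constants are invented for illustration, not drawn from any study. It shows how a sharing probability that grows with visible popularity lets a handful of seed shares snowball through a population.

```python
import random

def share_probability(prior_shares: int) -> float:
    """Assumed form: willingness to share grows with visible popularity,
    starting near zero and saturating at 0.9. Constants are invented."""
    return min(0.9, 0.02 + 0.15 * prior_shares ** 0.5)

def simulate_cascade(users: int = 10_000, seed_shares: int = 5) -> int:
    """Each user in turn sees the current share count and decides whether
    to reshare; every reshare makes the next user more likely to follow."""
    random.seed(42)
    shares = seed_shares
    for _ in range(users):
        if random.random() < share_probability(shares):
            shares += 1
    return shares

# A handful of seed shares snowballs into thousands under these assumptions.
print(simulate_cascade())
```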
While Facebook has the potential to serve as a powerful tool for community building and information sharing, its role in facilitating the spread of misinformation represents a significant threat to society. Tackling these issues is essential if the platform is to contribute positively to society and foster informed public discourse. The path forward involves not only technological and policy solutions but also a collective effort to promote critical media literacy and a shared commitment to truth and accuracy in the digital age.
Section 2: The Impact of Misinformation on Society
Political Division
The impact of misinformation on society is profound and multifaceted, with one of the most significant effects being political polarization. Misinformation can deepen divisions within societies, heighten tensions, and undermine the democratic process, often involving exaggerated, sensationalized, or entirely fabricated narratives that appeal to emotions rather than facts. Such content is more likely to be shared and can amplify extremist views, making them appear more mainstream than they actually are.
President Donald Trump’s presidency and electoral campaigns are a clear example of how spreading misinformation can create division and radicalisation. Trump has been accused of amplifying misinformation through his social media platforms: The Washington Post documented 30,573 false or misleading claims over his four years in office (Kessler et al., 2021). These included unfounded claims, conspiracy theories, and misleading information on topics ranging from political opponents to public health measures. His position as President gave his statements significant weight, leading to widespread dissemination and acceptance of misinformation among his supporters.
Roberts-Ingleson and McCann (2023) claim that “misinformation is designed to elicit strong emotions and legitimise extreme beliefs, including propaganda and conspiracy theories.” They go on to describe propaganda as “false or misleading information” designed to exploit and influence its “consumer’s beliefs and preferences to achieve a political goal” (BBC, 2021). This pattern was visible throughout much of President Trump’s presidency.
Among his most dangerous misleading claims was his reaction to losing the 2020 election. Statements such as “We won this election, and we won it by a landslide” put doubt into his supporters’ minds. Paired with the proclamation “We will never give up. We will never concede. It doesn’t happen” and the call to action “We are going to the Capitol”, these claims spread online like wildfire.
Although these remarks were not made on Facebook, they were highly prominent on the platform, which had over 2.9 billion users at the end of 2021 (Datareportal, 2023). Welcomed by pro-Trump echo chambers, these statements are widely believed to have contributed to the Capitol riots.
Public Health
Facebook’s relationship with public health during the COVID-19 pandemic has faced significant scrutiny, particularly concerning the platform’s role in the spread of misinformation. This negative aspect has had profound implications for public health efforts, influencing behaviours and attitudes in ways that have sometimes undermined the public response to the pandemic.
A study from 2022 analysed Facebook data comprising posts concerning COVID-19 from Facebook pages or groups with 95,000 members or 100,000 likes. The study collected data from 8 March 2020 to 1 May 2020, and in that time found “13,437,700 posts to Facebook pages, and 6,577,307 posts to Facebook groups, containing keywords pertaining to COVID-19” (Broniatowski et al., 2022).
Within these posts, content related to COVID-19 was more likely to cite ‘more credible’ sources; however, ‘not credible’ sources appeared more often within Facebook groups, which were “3.67 times more likely to share misinformation” (Broniatowski et al., 2022).
Despite efforts to control it, Facebook has been a hotbed for the spread of inaccurate ‘evidence & statistics’ about the pandemic. False information about the virus’s origins, prevention methods, treatments, and vaccines has flourished on the platform. Misleading posts have included dangerous health advice, conspiracy theories about the pandemic being a hoax, and misinformation about the safety and efficacy of vaccines. This widespread dissemination of incorrect information has posed a significant challenge to public health authorities trying to manage the pandemic response effectively.
Although Facebook’s spread of pandemic misinformation has been in the spotlight in recent years, its relationship with public health has faced challenges beyond the specific context of COVID-19, highlighting broader issues with misinformation that can negatively impact public health outcomes. One example is the platform’s contribution to mental health problems, particularly among teenagers and young adults. The spread of unrealistic body image standards and the intense social comparison facilitated by curated posts can lead to issues like depression, anxiety, and body dysmorphia. Although not misinformation in the traditional sense, the presentation and promotion of unattainable lifestyles can be misleading and have a tangible negative impact on mental health.
Final Thoughts
The mechanisms of misinformation spread on Facebook are complex and multifaceted, leveraging the platform’s vast reach and the natural tendencies of users to engage with sensational content. Sensationalism, echo chambers, and cognitive biases significantly accelerate the spread of false information.
As misinformation proliferates, its impact on society becomes increasingly detrimental, eroding trust in institutions, exacerbating divisions, and compromising public health and safety. Sensational content grabs attention and encourages sharing, while echo chambers reinforce existing beliefs and insulate users from contradictory viewpoints. Cognitive biases skew perception and judgment, making misinformation more appealing and harder to counteract.
Addressing these challenges is crucial for maintaining societal cohesion and ensuring the health and safety of the public. Tackling the spread of misinformation on Facebook requires a joint effort from all involved: the platform itself, users, organisations, and educators. The situation demands a comprehensive approach involving stricter regulatory frameworks, improved digital literacy education, and healthy community engagement strategies.
References:
- BBC. (2021, February 14). Capitol riots: Did Trump’s words at rally incite violence?. https://www.bbc.com/news/world-us-canada-55640437
- Broniatowski, D. A., Kerchner, D., Farooq, F., Huang, X., Jamison, A. M., Dredze, M., Quinn, S. C., & Ayers, J. W. (2022). Twitter and Facebook posts about COVID-19 are less likely to spread misinformation compared to other health topics. PLoS One, 17(1). https://doi.org/10.1371/journal.pone.0261768
- Brown, D. K., Harlow, S., García-Perdomo, V., & Salaverría, R. (2018). A new sensation? An international exploration of sensationalism and social media recommendations in online news publications. Journalism, 19(11), 1497-1516. https://doi.org/10.1177/1464884916683549
- Brügger, N. (2015). A brief history of Facebook as a media text: The development of an empty structure. First Monday, 20(5). https://doi.org/10.5210/fm.v20i5.5423
- Cinelli, M., De Francisci Morales, G., Galeazzi, A., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9). https://doi.org/10.1073/pnas.2023301118
- Datareportal. (2023, May 11). Facebook Users, Stats, Data & Trends. https://datareportal.com/essential-facebook-stats
- Kessler, G., Rizzo, S., & Kelly, M. (2021, January 24). Trump’s false or misleading claims total 30,573 over 4 years. The Washington Post. https://www.washingtonpost.com/politics/2021/01/24/trumps-false-or-misleading-claims-total-30573-over-four-years/
- Mitchell, R. J. (2021). Twenty-one Mental Models That Can Change Policing: A Framework for Using Data and Research for Overcoming Cognitive Bias (1st ed.). Routledge. https://doi.org/10.4324/9780367481520
- Roberts-Ingleson, E., & McCann, W. (2023). The Link between Misinformation and Radicalisation: Current Knowledge and Areas for Future Inquiry. Perspectives on Terrorism, 17(1), 37-48. https://www.jstor.org/stable/27209215
