Social networking platforms facilitate the creation of networks across borders and cultures, but this highly interconnected world is vulnerable to manipulative business practices such as emotional contagion deployed in favour of growth and success. This paper identifies and examines the potential reasons Facebook may be employing mass-scale emotional contagion to manipulate users on its platforms for monetary gain. The paper concludes that this is likely, given that Facebook has a history of unethical business strategy and optimises its users' behaviours to increase profits from channels such as advertising.
Our digitally interconnected world grows closer as more of the population goes online to donate their data, frequently in the form of content; content which, by virtue of consisting of data, typically brings more value to the companies collecting it than to the people engaging with it. This paper examines how the convergence of data science and psychology has allowed for mass-scale emotional contagion on social networking platforms, and theorises whether platforms like Facebook are manipulating networks of users with these techniques for monetary gain. In further detail, it discusses how this pre-internet psychological theory could be employed by technology companies to analyse and optimise data collection in service of the advertising revenue model. Although social media platforms facilitate the formation of strong networks, the companies behind them can exploit these networks through manipulative and unethical business strategies such as emotional contagion to benefit their bottom line.
What Is Emotional Contagion?
Emotional contagion describes the transfer of positive and negative moods from one person to another (Kramer, Guillory & Hancock, 2014, 2) and is well established in laboratory experiments with in-person interactions. A longitudinal study spanning 20 years suggests that longer-lasting moods such as happiness and depression can also be transferred through networks (Fowler & Christakis, 2008, 1). This means that emotional transfer from person to person can take place without in-person interaction. The billions of people who use social networking platforms like Facebook may therefore be subjecting themselves to long-lasting emotional disturbances as a result of their online interactions. The idea that the emotions of one friend transfer to another was tested in a 2012 study in which 689,003 Facebook users were materially unaware they had been enrolled as test subjects in an experiment to determine whether the expressed emotional sentiment of a third party (another user) could manipulate their behaviour through emotional contagion (Kramer, Guillory & Hancock, 2014). The study, conducted by Kramer, Guillory and Hancock, correctly hypothesised that emotional contagion can occur via text-based, computer-mediated communication: people's emotional expressions in the form of posts on Facebook predicted their friends' emotional expressions on Facebook in the days following initial exposure. The experiment was possible because a social network user engages most heavily with content in their 'news feed', which is entirely determined by the platform. Although preferential bias can vary across the digital landscape, algorithms created by social media platforms typically rank content in a user's news feed based on metrics which determine its relevance and popularity (engagement).
If these algorithms are effective, users' engagement rates increase, their overall experience improves and their attention is retained for longer periods (Wong, Liu & Chiang, 2016, 1-3).
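The kind of engagement-based ranking described above can be sketched in a few lines. This is a minimal illustration only: the fields, weights and time-decay formula below are invented for the example, and Facebook's actual news feed ranking uses a far larger, non-public set of signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int
    age_hours: float

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and shares are assumed to signal
    # deeper engagement than likes; older posts decay in relevance.
    raw = post.likes + 3 * post.comments + 5 * post.shares
    return raw / (1.0 + post.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring ("most relevant") content is shown first.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a scheme like this, a post with a handful of comments and shares outranks one with more raw likes, which is the sense in which the platform, not the user, determines what appears at the top of the feed.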
The 2012 study focused on two questions: whether exposure to consistent categorical emotions led users to change their own posting sentiments, and whether those posting behaviours were consistent with the expressions they were initially exposed to. The test subjects were split into two variable groups. One group was exposed to a bias of positive expressions by their friends within their news feed, and the other was exposed to a negative bias of content from their friends. Facebook was able to control the emotional sentiment of content appearing in the test subjects' news feeds because posts were categorised automatically by an algorithm trained to spot positive or negative words as defined by the Linguistic Inquiry and Word Count software (LIWC2007). The study found that mass-scale emotional contagion is achievable by manipulating what a user interacts with: subjects exposed to negative expressions were more likely to post negative-leaning content of their own following exposure, and the opposite was true for those exposed to positive expressions from friends. Although Cornell University's Institutional Review Board for Human Participant Research (IRB) approved the study ("Research with Human Participants | Cornell Research Services", 2020), Kramer and his research colleagues did not gather informed consent from the subjects (Kleinsman & Buckley, 2015, 182). Widespread public scrutiny ensued, as the research group appeared to have inadvertently uncovered a more sinister side of Facebook. With this in mind, we address the likelihood that Facebook is employing this technique for greater monetary reward.
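The categorisation step can be approximated as simple dictionary matching. Note the caveat: LIWC2007 is proprietary software and its emotion dictionaries contain thousands of entries; the word sets below are illustrative stand-ins, not the real lists.

```python
# Illustrative stand-ins for the LIWC2007 positive/negative dictionaries.
POSITIVE = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "worried"}

def classify_post(text: str) -> str:
    """Label a post by which emotion dictionary it matches more often."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

Running a classifier like this over every candidate post is what made the manipulation automatable at the scale of hundreds of thousands of users: no human ever needed to read the content being filtered in or out of a feed.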
The Facebook Business Model
In order for Facebook to benefit monetarily from emotional contagion, the behaviour of users on the platform must affect the business model at one of the crucial stages for success, which can be broken down into a few large components. First, if the end goal is to generate advertising revenue from third-party companies marketing their products and services on the platform, Facebook must ensure that there are in fact users on the platform to market to. Second, users must be regularly engaging with their connected networks on the platform for third-party advertisers to have any chance of reaching them. If Facebook has 'x' users and ninety-five per cent of them are not regularly engaging with the platform, advertisers' marketing efficiency, or 'marketing return on investment' (MROI), will decrease. A decrease in MROI describes a scenario whereby a marketing team's promotional spend becomes less efficient (Gallo, 2017). The platform must monitor user sentiment to ensure the experience is not diluted by adverts to such a degree that users are discouraged from using the platform; this falls under user experience design (UX). A user whose news feed is inundated with advertisements is unlikely to remain engaged, as they are put off by poor UX. Social networking platforms like Facebook must therefore find a balance between boosting revenue in the short term at the risk of providing a poor UX, and wasting an opportunity to increase revenue without detracting from UX before the user's breaking point. Third and finally, Facebook must provide the tools necessary for advertising partners to target and engage potential customers effectively. Powerful advertising tools can increase marketing efficiency and MROI by tailoring adverts to a specific target audience, which encourages advertisers to increase budget allocation through the channel (Tarran, 2018).
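The MROI measure Gallo describes has a direct expression; the dollar figures in the comment are invented purely for illustration.

```python
def mroi(incremental_value: float, marketing_spend: float) -> float:
    # MROI = (incremental financial value gained from marketing - cost
    #         of the marketing) / cost of the marketing
    return (incremental_value - marketing_spend) / marketing_spend

# Example (hypothetical numbers): a campaign costing $10,000 that drives
# $35,000 of incremental sales returns $2.50 per dollar spent:
# mroi(35_000, 10_000) -> 2.5
```

The dependence on incremental value is what ties advertiser behaviour back to user engagement: if disengaged users never see or act on adverts, the numerator shrinks while the spend does not, and budgets move elsewhere.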
Advertising partners are the key for Facebook to generate profits and grow as a company. An advertiser on a digital platform can be described as any person or organisation which stands to benefit by reaching certain demographics of the public; they deliver value in 'order to generate profitable and sustainable revenue streams' (Yang, Kim & Dhalwani, n.d.). The advertiser has several ways to improve efficiency, including media mix modelling and price modelling, but we focus here on user targeting (Xie & Liu, 2018). User targeting is the ability to reach the right demographic at the right time for improved marketing efficiency, with the help of segmentation tools, recommender systems and prediction modelling for conversion. An advertiser must display adverts to potential customers, not to users outside of their target demographic, and the advert must simultaneously be shown at the right time to encourage purchase (Xie & Liu, 2018, 1). Given the amount of revenue generated from advertising on Facebook and its subsidiaries such as Instagram, it is clear this component of the business model has proved astronomically successful. The targeting tools Facebook provides for advertising partners have encouraged them back to the platform to deploy bigger and more elaborate budgets and promotional campaigns. In 2019 Facebook generated $70.1b USD in revenue, with $69.7b USD of that generated from digital advertising. This figure is up from $55.8b USD in 2018 and $40.6b USD in 2017. The strong year-on-year growth in digital advertising revenue correlates strongly with Facebook's user base growth ("Facebook Reports Fourth Quarter and Full Year 2019 Results", 2019). With a base understanding of Facebook's business model and its success over the last decade in particular, we must next determine whether Facebook could be benefiting from mass-scale emotional contagion.
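At its simplest, the segmentation described above reduces to filtering users against an advertiser's criteria. The fields and thresholds here are hypothetical; real targeting stacks layer recommender systems and conversion-prediction models on top of basic filters like this one.

```python
from dataclasses import dataclass

@dataclass
class User:
    age: int
    country: str
    interests: set[str]

def in_target_audience(user: User, min_age: int, max_age: int,
                       countries: set[str], interests: set[str]) -> bool:
    # A user qualifies if they fall inside the campaign's age band,
    # live in a targeted country, and share at least one interest
    # with the campaign's interest set.
    return (min_age <= user.age <= max_age
            and user.country in countries
            and bool(user.interests & interests))
```

Every additional attribute the platform can collect about a user tightens filters like this, which is one concrete way richer data collection translates into higher advertiser MROI and, in turn, larger ad budgets.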
The Benefits To Facebook Of Effective Emotional Contagion
Understanding this business model tells us that Facebook must keep users on the platform in order to generate digital advertising revenue. If Facebook has resolved that it can control the emotional sentiment of a user's news feed such that their attention and engagement rates increase, it is entirely possible it is doing so for greater monetary reward with little regard for public health. Adam Kramer, who designed the infamous 2012 study, stated publicly that 'we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook' (D'Onfro, 2014). This declaration from the person who designed the study, and who applied for and was approved funding through Cornell University and the University of California, San Francisco (UCSF) on behalf of Facebook's research team, suggests the company had a motive to understand and then manipulate user behaviour for its own benefit. Studies have shown that teenagers who spent a week off Facebook were happier than a test group which continued their average daily usage of the social media site (Twenge, Joiner, Rogers & Martin, 2018). The business paradox being experienced by Facebook reads that at some point, a company's need to make money will be at odds with its desire to do good. Facebook's goal to 'bring the world closer together' (Facebook – Resources, 2019) is at odds with shareholder pressure and c-suite appetite for increased profits. This declaration from the research team may therefore suggest an ulterior motive, namely profit, drove the research, and it suggests Facebook has built a platform capable of optimising behaviour through user interface (UI) design to increase engagement.
User Interface/User Experience Design and Optimisation
Digitally inclined companies around the world have at their disposal tools which can assist data scientists in optimising both UI and UX to increase profits, whether by reducing costs or increasing revenue. Facebook employs large teams of data scientists to optimise its platform for profit ("Gaining insights to deliver meaningful social interactions", n.d.). A precedent has therefore been set that Facebook is willing to manipulate its users in order to eke out greater profit margins. Data scientists in Facebook's Core Data Science (CDS) team use the resources at their disposal to engage users through various channels ("Core Data Science (CDS) – Facebook Research", n.d.). If a user isn't on the platform and scrolling, the first priority of these teams is to entice the user back onto the platform, and the second is to keep them there. This two-part goal is achieved through psychological tricks founded on B. F. Skinner's 1930 Harvard study on the 'Reinforcement Theory of Motivation', tricks which users are typically unaware of (Leslie, 2016). 'Reinforcement Theory of Motivation' describes 'specifically how people learn behavior and learn how to act'. Amongst other behavioural psychology tricks, Facebook uses validation, variable rewards, haptic output, investment theory and the in-app infinite-scroll design, which eliminates pagination and thus reduces engagement friction and interruption (Amutan, 2014, 682).
To achieve an optimally efficient output from these techniques, a platform must optimise rudimentary ideas which have the potential to lead to increased engagement. Facebook has been utilising optimisation tools such as A/B testing, which pits one variable against another to determine which performs best against the goal of the test. For example, if Facebook has a goal of increasing the click-through rate (CTR) of a certain class of notification, an A/B test can determine which copy performs better by attracting more users back onto the platform. By running this simple test across millions of interactions, Facebook is able to accurately analyse its users' preferences and optimise how regularly they return to the platform (Benbunan-Fich, 2017, 20). An increase in user engagement results in greater average screen time in its applications, which in turn results in more display advert views, and thus greater advertising revenue.
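Comparing two notification variants by CTR is, statistically, a standard two-proportion test. The sketch below uses a z-statistic and invented counts; it illustrates the method, not Facebook's actual experimentation tooling.

```python
import math

def ctr_z_statistic(clicks_a: int, sends_a: int,
                    clicks_b: int, sends_b: int) -> float:
    """Two-proportion z-test: is variant B's click-through rate
    significantly different from variant A's?"""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled proportion under the null hypothesis of equal CTRs.
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_b - p_a) / se

# With 10,000 sends per arm (hypothetical numbers), a 5% vs 6% CTR
# yields z ≈ 3.1, well beyond the conventional 1.96 threshold for
# significance at the 5% level.
```

At the scale the text describes, millions of interactions per test, even tiny differences in copy clear this threshold quickly, which is why such micro-optimisations compound into measurable changes in how often users return.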
Unethical Business Strategy Precedence
Facebook’s founder, Mark Zuckerberg, has previously been accused of unethical business strategy: first in allegedly copying the idea for the platform from two Harvard colleagues, the Winklevoss brothers (Ionescu, 2011), and more recently, along with other executives at the company, of favouring revenue over the best interests of users (Kleinsman & Buckley, 2015). One such example can be seen in the 2016 US presidential election campaign, in which foreign agents clearly abused the powerful targeting tools on the platform to influence the election’s outcome (Mueller, III, 2019). In the fall-out of this ground-breaking realisation that the fate of a country’s leadership contest could be affected by purchasing a relatively low-cost amount of digital advertising space, Facebook has shockingly affirmed its position for the 2020 election race, stating it will not remove politically motivated content purchased by campaigners in the lead-up to voting in November 2020 (Milman, 2019). Zuckerberg is on the record stating ‘people should be able to judge for themselves the character of politicians’. The contentious point of this stance is the apparent swarm of ‘fake news’ on the site from advertisers looking to benefit without concern for the real-world implications of their actions (Figueira & Oliveira, 2017). Multiple foreign agents, whether operating exclusively for personal gain or as part of larger governmental departments, are tipped to attempt some sort of interference in the run-up to the 2020 presidential election (McFaul, 2019, 18-20). Facebook’s inaction again calls into question the moral compass of its executive team.
Facebook and its subsidiaries facilitate billions of interactions each day around the world. As an entity, it has the power to tweak and adjust the platform however the executive team deems necessary, confident that users have never read, nor will ever read, the terms of service stating that they hand control to Facebook and its researchers for participation in research. Facebook has proved time and again that it is capable of optimising the platform for monetary benefit through strong recommendation and algorithmic targeting for advertising partners, which have driven profits at nearly unprecedented rates in the last decade. Zuckerberg and the executive team have set a precedent of making unethical business decisions in favour of profits, and the research team at Facebook has demonstrated that users are materially unaware of the power they give the company to manipulate their behaviour and emotions. Given this context, it is likely that Facebook is utilising emotional manipulation techniques on its users at mass scale in order to optimise revenue generation from digital advertising.
Amutan, K. (2014). A Review of B. F. Skinner’s ‘Reinforcement Theory of Motivation. International Journal Of Research In Education Methodology, 5(3), 682. Retrieved from https://www.researchgate.net/publication/306091479_A_Review_of_B_F_Skinner’s_’Reinforcement_Theory_of_Motivation
Benbunan-Fich, R. (2017). The ethics of online research with unsuspecting users: From A/B testing to C/D experimentation. Research Ethics, 13(3–4), 200–218. https://doi.org/10.1177/1747016116680664
Core Data Science (CDS) – Facebook Research. Retrieved 1 April 2020, from https://research.fb.com/teams/core-data-science/
Corrigendum: Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time. (2019). Clinical Psychological Science, 7(2), 397–397. https://doi.org/10.1177/2167702618824060
D’Onfro, J. (2014). Facebook Researcher Responds To Backlash Against ‘Creepy’ Mood Manipulation Study. Retrieved 1 April 2020, from https://www.businessinsider.com.au/adam-kramer-facebook-mood-manipulation-2014-6?r=US&IR=T
Facebook Reports Fourth Quarter and Full Year 2019 Results. (2019). Retrieved 3 April 2020, from https://investor.fb.com/resources/default.aspx
Figueira, Á., & Oliveira, L. (2017). The current state of fake news: challenges and opportunities. Procedia Computer Science, 121, 817-825. doi: 10.1016/j.procs.2017.11.106
Fowler, J., & Christakis, N. (2008). Dynamic spread of happiness in a large social network: longitudinal analysis over 20 years in the Framingham Heart Study. BMJ, 337(dec04 2), a2338-a2338. doi: 10.1136/bmj.a2338
Gaining insights to deliver meaningful social interactions. Retrieved 1 April 2020, from https://research.fb.com/category/data-science/
Gallo, A. (2017). A Refresher on Marketing ROI. Retrieved 3 April 2020, from https://hbr.org/2017/07/a-refresher-on-marketing-roi
Ionescu, D. (2011). Winklevoss Twins V. Facebook: Case Closed. Retrieved 1 April 2020, from https://www.cio.com/article/2409242/winklevoss-twins-v–facebook–case-closed.html
Kleinsman, J., & Buckley, S. (2015). Facebook Study: A Little Bit Unethical But Worth It?. Journal Of Bioethical Inquiry, 12(2), 179-182. doi: 10.1007/s11673-015-9621-0
Kramer, A., Guillory, J., & Hancock, J. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings Of The National Academy Of Sciences, 111(24), 8788-8790. doi: 10.1073/pnas.1320040111
McFaul, M. (2019). Securing American Elections (pp. 18-20). Stanford: Stanford University. Retrieved from https://cyber.fsi.stanford.edu/securing-our-cyber-future
Milman, O. (2019). Defiant Mark Zuckerberg defends Facebook policy to allow false ads. Retrieved 1 April 2020, from https://www.theguardian.com/technology/2019/dec/02/mark-zuckerberg-facebook-policy-fake-ads
Mueller, III, R. (2019). Report On The Investigation Into Russian Interference In The 2016 Presidential Election (pp. 24-25). Washington, D.C.: U.S. Department of Justice. Retrieved from https://www.justice.gov/storage/report.pdf
Research with Human Participants | Cornell Research Services. (2020). Retrieved 3 April 2020, from https://researchservices.cornell.edu/compliance/human-research
Tarran, B. (2018). What can we learn from the Facebook-Cambridge Analytica scandal?. Significance, 15(3), 4-5. doi: 10.1111/j.1740-9713.2018.01139.x
Twenge, J. M., Joiner, T. E., Rogers, M. L., & Martin, G. N. (2018). Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time. Clinical Psychological Science, 6(1), 3–17. https://doi.org/10.1177/2167702617723376
Wong, F., Liu, Z., & Chiang, M. (2016). On the Efficiency of Social Recommender Networks. IEEE/ACM Transactions On Networking, 24(4), 2512-2524. doi: 10.1109/tnet.2015.2475616
Xie, Z., & Liu, Y. (2018). User-level incremental conversion ranking without A/B testing. Journal of Marketing Analytics, 6(2), 62-68. doi: 10.1057/s41270-018-0031-0