Content Warning: Racism
On 14 October 2023, the constitutional amendment proposed by the Voice to Parliament referendum, as outlined in the Uluru Statement from the Heart (2017), was rejected, with 60% of Australians voting ‘no’ (Berry, 2024). In an era of increased economic insecurity, driven by rising interest rates and a decline in living standards, the referendum arrived at a time when working-class and “downwardly middle-class voters” were attending more to their own circumstances and less to the plight of those less fortunate (Berry, 2024, p. 243). Seeds of division were easily sown by centre-right and far-right political parties who were “opposed to creating a ‘special status’ for First Nations’ peoples” (Berry, 2024, p. 243) and who argued that the Voice would divide the nation whilst wilfully ignoring pre-existing intersections of inequality such as ethnicity, class and gender. The word ‘division’ therefore featured prominently across the campaign trail, with the ‘No’ campaign purporting that the Voice would “divide Australians by race and rights” (Carson et al., 2024, p. 309). McAllister and Biddle (2024) suggest that the principle of equality is undermined by the widespread belief that Indigenous people receive ‘special benefits’ from the government that are not accessible to other Australians, and that it was these pre-existing structures of othering First Nations’ peoples which allowed disinformation along the lines of division to flourish.

Disinformation is the intentional spread of inaccurate, false or misleading information, designed for profit or to cause public harm; its purveyors promote ideological viewpoints, decrease trust in mainstream institutions and recruit others to their cause (Freelon et al., 2020, p. 1199). Disinformation spreads across social media platforms via their algorithmic affordances, which have emerged out of a post-truth, participatory culture (Jenkins, 2006). ‘Spreadability’ in this context refers to the social networks that allow media texts to circulate, and to the technical affordances and economic structures that support the sharing of material, which in turn allows disinformation to be discoverable (Jenkins et al., 2013). ‘Discoverability’ relates to the ways social media platforms coordinate users into finding specific pieces of content (Lobato, 2018; McKelvey & Hunt, 2019); promoters of disinformation often act ‘below the radar’ to formulate a refracted public which “facilitates the dissemination of specific messages away from or toward target audiences” (Abidin, 2021, p. 10). Refracted publics construct spaces out of refracted perceptions and use platform affordances to manipulate visibility and access, boosting content engagement. They make their content discoverable via tactics of subversion such as hashtag jacking, and are silosocial, making content visible to specific subcommunities through techniques such as ‘dogwhistling’ (Abidin, 2021). Refracted publics maximise algorithmic affordances to make their content impactful by creating fake accounts that shape and boost engagement, and weaponise contexts by intentionally remixing content so that it can be reappropriated for different audiences (Abidin, 2021).
Whilst disinformation spread by the ‘No’ campaign was not the overarching reason for the referendum’s failure, emerging evidence suggests that it was effective in sustaining participatory troll behaviour and conspiracy theorising aimed at polarising people’s attitudes by arousing negative emotions, with political leaders, commentators and mainstream news agencies amplifying its discoverability (Berry, 2024; Carson et al., 2024; Graham, 2023). This paper will outline the various ways disinformation was used by the ‘No’ campaign in the lead-up to the Voice referendum: first, by demonstrating how a participatory culture of astroturfing, conspiracy theorising and dogwhistling created and sustained a refracted public to sway voter sentiment; and second, by providing evidence for how these publics deployed platform affordances such as hashtag jacking and cross-platform sharing to weaponise contexts and amplify the discoverability of disinformation.
Disinformation campaigns coordinate inauthentic behaviour and attempt to cultivate seemingly organic online communities to spread their preferred narrative (Cover, 2022). The ‘No’ campaign achieved this through digital astroturfing: a form of sentiment seeding in which media such as out-of-context headlines and quotations are manipulated to create a distorted version of events, softening public perception in order to refract it (Graham, 2023; Abidin, 2021). Facebook pages such as ‘Referendum News’ were thinly veiled, news-like sources sponsored by Advance, a conservative lobby group “claiming to represent ‘mainstream Australia’ by removing the far-left’s control” (Carson et al., 2024, p. 19), which came to prominence during the 2019 election under the name ‘Advance Australia’. A digital astroturfing campaign was run on Facebook, with Advance controlling multiple pages that pushed highly emotive clickbait articles targeted at specific voter cohorts to spread messages such as ‘Vote No to Division’. Clickbait comprises “strategically designed teasers intended to attract viewer attention, extend viewer interest, and facilitate viewer action” (Abidin, 2021, p. 9), often carrying contradictory messages used to provoke and exploit fear amongst different target demographics (Graham, 2023). Another example of astroturfing occurred when conservative politicians and their agribusiness backers reinforced an ungenerous and deeply racist attitude towards First Nations’ peoples by representing the Voice as “likely to lead to an attack on private property owned by non-Indigenous Australians” (Berry, 2024, p. 242). Assertions that the Voice would lead to a ‘land-grab’ were made by politicians such as One Nation senator Pauline Hanson, who stated “rivers and streams [will be] owned by local Aborigines, who will charge the rest of us for water consumption” (Graham, 2023, p. 9). The ‘globalist land-grab’ rhetoric that evolved became a central part of the ‘No’ campaign and was fuelled by online, anti-globalist conspiracy theorists who asserted the Voice would be part of a “hidden elitist agenda to control resources and land”, a popular anti-Semitic conspiratorial trope alleging that philanthropist George Soros “secretly controls the US government and global financial markets” (Graham, 2023, p. 15; Freelon et al., 2020). Carson et al. (2024) suggest the disinformation cycle was perpetuated by US-style “participatory disinformation” (p. 36), in which audiences share misleading, crowdsourced stories that are then picked up by mainstream media outlets, which recirculate and reinforce fake news narratives, leading to their amplification. This cycle was realised in participatory fashion when the ‘globalist’ theory was pushed by Opposition Leader Peter Dutton, who ‘dogwhistled’ that the Voice “pandered to a ‘shadowy Canberran elite’ and that the lack of detail risks deception of the Australian public” (Graham, 2023, p. 15). A dogwhistle “is a deliberate form of communication that carries two potential meanings” (Witten, 2014, as cited in Graham, 2023, p. 10), where a general audience interprets the message as benign whilst the refracted audience recognises the coded message for its intended meaning (Abidin, 2021). Dogwhistling also occurred during the campaign via alt-right Covid conspiracy theorists such as Avi Yemini, who suggested that Pfizer was backing the ‘Yes’ campaign and therefore urged his supporters not to back it.
The ‘No’ campaign was also bolstered by Indigenous politicians such as Senator Jacinta Nampijinpa Price, who argued that “Aboriginal Australians were better off because of British colonisation” (Berry, 2024, p. 244) and behind whom Coalition party leaders Peter Dutton and David Littleproud could rally, spreading their message to those unfamiliar with Indigenous disadvantage by repeating the highly effective campaign slogan “if you don’t know, vote no” (p. 244). Marwick and Lewis (2017) call this ‘trading up the chain’, where extreme viewpoints or conspiracy theories are planted in a news outlet that may not fact-check them, such as Sky News. The story is then repeated by larger outlets or endorsed by prominent politicians, such as Price or Dutton, in order to manipulate the media and sway voter sentiment towards their cause. Through coordinated digital astroturfing, dogwhistling and conspiracy theorising, the ‘No’ campaign illustrated the ways refracted publics use participatory culture to manipulate media narratives and sway voter sentiment.
The spread of disinformation is supported by the technical, structural, economic and content-based affordances of social media platforms, which foster open spaces for like-minded individuals to share their beliefs, ideas and conspiracy theories amongst connected groups with a common purpose (Mihailidis & Viotty, 2017). Where searchability in networked publics refers to the ways content can be accessed through search (Boyd, 2010), a refracted public chances upon content via discoverability (Abidin, 2021). To make their content discoverable, purveyors of disinformation maximise platform affordances to surface their content to those inside and outside their user base (Abidin, 2021). This is achieved via algorithmic manipulation such as hashtag jacking, which pushes disinformers’ desired messages onto an unsuspecting public, or by using cross-platform affordances to share the same stories across multiple channels (Freelon et al., 2020). Carson et al. (2024) suggest that disinformation around the referendum’s integrity was the most destructive, in that it “struck directly at trust in core political and social institutions” (p. 37). Comparative global research has demonstrated that disinformation feeds into voter distrust and can magnify anxieties surrounding election fairness. Distrust in the Australian Electoral Commission (AEC) was sown through the ‘No’ campaign’s assertion that individuals were able to cast multiple votes, a claim promoted through the hashtag jacking (Abidin, 2021) of #voteoften across multiple social media channels. The claim was also bolstered by Dutton, who asserted the process was ‘rigged’, and is an example of how refracted publics engage with content surrounding voter trust and how theories of election interference emerge. Katzenbach and Ulbricht (2019) suggest algorithms favour networked collectives that are “already privileged while discriminating against marginalised people” (p. 7) due to biased data sets that reify classifications surrounding class, gender and ethnicity, reflecting pre-existing power structures and routinely amplifying disinformation. Algorithms empower disinformation along these biased data sets by personalising people’s newsfeeds, herding users into artificially curated environments, or echo chambers, which confirm their preconceived views (Srivastava, 2023; Dutton et al., 2019). The ‘No’ campaign was faster and more adept at utilising social media’s cross-platform affordances on TikTok, X (formerly Twitter), Facebook and Instagram to engage with and herd undecided voters (Carson et al., 2024). The campaign used authentic voices and storytelling from Jacinta Nampijinpa Price and Nyunggai Warren Mundine to personalise its message to younger voters; these messages were then amplified by Sky News Australia through the reach of its YouTube channel. Stories on Sky News were then reposted back to social media platforms such as Reddit, and by political commentators and conservative politicians who further sustained the amplification of disinformation through their X and Facebook accounts (Carson et al., 2024). Marwick and Lewis (2017) assert that misinformation crafted by media manipulators is more likely to stick, as it is impossible for mainstream media outlets to correct it once it has gained traction, either because the information confirms people’s pre-existing beliefs or because the false narrative is more compelling to share.
Freelon et al. (2020) suggest right-wing commentators generally attract small audiences and therefore rely on mainstream media outlets to help circulate their extreme ideas to a broader ecosystem. ‘Referendum News’ achieved this by creating parallel literacies, where single pieces of content are deliberately encoded with a “kaleidoscope of messages” (Abidin, 2021, p. 8) so that they can be interpreted in different ways by different audiences. The page leveraged content from mainstream news outlets such as news.com.au and the ABC in over 280 advertisements to amplify concerns around the Voice; these advertisements centred on themes such as changing the date of Australia Day and claims that big businesses were forcing their employees to vote ‘Yes’ (Carson et al., 2024). Where Boyd (2010) suggests collapsed contexts shape networked publics by making social context difficult to maintain, Abidin (2021) refers to ‘weaponised contexts’, where content is intentionally remixed to collapse distinct socio-cultural contexts. By exploiting platform affordances, the refracted publics that emerged from the ‘No’ campaign weaponised various contexts, amplified distrust via parallel literacies and manipulated algorithms to circulate disinformation, eroding public confidence in democratic institutions such as the AEC.
This paper has argued that though the failure of the Voice referendum was not solely the result of disinformation, the impact of the ‘No’ campaign’s deployment of a participatory culture of disinformation dissemination via platform affordances and weaponised contexts cannot be overstated. Through digital astroturfing, dogwhistling and engagement with conspiracy theories via prominent political figures, the ‘No’ campaign was able to deepen existing social divisions (the very theme on which much of its campaign ran), creating and sustaining a refracted public. By manipulating the discoverability of disinformation through hashtag jacking and strategic cross-platform sharing, alt-right influencers and conservative politicians sustained a campaign of parallel literacies which undermined confidence in both the referendum process and Indigenous sovereignty and self-determination.
References
Abidin, C. (2021). From “networked publics” to “refracted publics”: a companion framework for researching “below the radar” studies. Social Media + Society, 7(1). https://doi.org/10.1177/2056305120984458
Berry, M. (2024). The Voice referendum. Journal of Australian Political Economy, (92), 240–248. https://search.informit.org/doi/10.3316/informit.T2024031400017990928976236
Boyd, D. (2010). Social network sites as networked publics: Affordances, dynamics, and implications. In Z. Papacharissi (Ed.), A networked self: Identity, community, and culture on social network sites (pp. 39–58). Taylor & Francis.
Carson, A., Evans, M., Strating, R., & Grömping, M. (2024). Voiceless: a multi-level analysis of the 2023 Voice to Parliament referendum outcome and its implications: an introduction. Australian Journal of Political Science, 59(3), 308–313. https://doi.org/10.1080/10361146.2024.2409075
Carson, A., Strating, R., Jackman, S., Grömping, M., Hayman, P., & Gravelle, T. B. (2024). Influencers and messages: analysing the 2023 Voice to Parliament referendum campaign. La Trobe University. https://osf.io/8nqg2
Cover, R. (2022). Fake news in digital cultures: technology, populism and digital misinformation (A. Haw & J. D. Thompson, Eds.). Emerald Publishing Limited.
Dutton, W. H., Reisdorf, B. C., Blank, G., Dubois, E., & Fernandez, L. (2019). The internet and access to information about politics: searching through filter bubbles, echo chambers, and disinformation. In M. Graham & W. H. Dutton (Eds.), Society and the internet: How networks of information and communication are changing our lives (2nd ed.). Oxford University Press. https://doi.org/10.1093/oso/9780198843498.001.0001
Freelon, D., Marwick, A., & Kreiss, D. (2020). False equivalencies: online activism from left to right. Science, 369(6508), 1197–1201. https://doi.org/10.1126/science.abb2428
Graham, T. (2023). Understanding misinformation and media manipulation on Twitter during the Voice to Parliament referendum. Digital Media Research Centre, Queensland University of Technology. https://osf.io/qu2fb/download
Jenkins, H. (2006). Confronting the challenges of participatory culture: media education for the 21st century (part one). Pop Junctions. https://henryjenkins.org/blog/2006/10/confronting_the_challenges_of.html
Jenkins, H., Ford, S., & Green, J. (2013). Spreadable media: Creating value and meaning in a networked culture. New York University Press.
Katzenbach, C., & Ulbricht, L. (2019). Algorithmic governance. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1424
Lobato, R. (2018). On discoverability. Flow. https://www.flowjournal.org/2018/05/on-discoverability/
Marwick, A., & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society Research Institute.
McAllister, I., & Biddle, N. (2024). Safety or change? The 2023 Australian Voice referendum. Australian Journal of Political Science, 59(2), 141–160. https://doi.org/10.1080/10361146.2024.2351018
McKelvey, F., & Hunt, R. (2019). Discoverability: toward a definition of content discovery through platforms. Social Media + Society, 5(1). https://doi.org/10.1177/2056305118819188
Mihailidis, P., & Viotty, S. (2017). Spreadable spectacle in digital culture: civic expression, fake news, and the role of media literacies in “post-fact” society. American Behavioral Scientist, 61(4), 441–454. https://doi.org/10.1177/0002764217701217
Srivastava, S. (2023). Algorithmic governance and the international politics of Big Tech. Perspectives on Politics, 21(3), 989–1000. https://doi.org/10.1017/S1537592721003145
Uluru Statement from the Heart. (2017). The statement. https://ulurustatement.org/the-statement/view-the-statement/
Volcic, Z., & Andrejevic, M. (2022). Automated media and commercial populism. Cultural Studies, 37(1), 149–167. https://doi.org/10.1080/09502386.2022.2042581