{"id":203,"date":"2019-04-27T20:34:34","date_gmt":"2019-04-27T12:34:34","guid":{"rendered":"https:\/\/networkconference.netstudies.org\/2019Open\/?p=203"},"modified":"2019-04-28T18:46:41","modified_gmt":"2019-04-28T10:46:41","slug":"what-role-did-social-media-play-in-the-christchurch-massacre","status":"publish","type":"post","link":"https:\/\/networkconference.netstudies.org\/2019Open\/2019\/04\/27\/what-role-did-social-media-play-in-the-christchurch-massacre\/","title":{"rendered":"What role did social media play in the Christchurch massacre?"},"content":{"rendered":"\n<div class=\"wp-block-file\"><a href=\"https:\/\/networkconference.netstudies.org\/2019Open\/wp-content\/uploads\/2019\/04\/PDF.pdf\">PDF<\/a><a href=\"https:\/\/networkconference.netstudies.org\/2019Open\/wp-content\/uploads\/2019\/04\/PDF.pdf\" class=\"wp-block-file__button\" download>Download<\/a><\/div>\n\n\n\n<p><strong>Stream &#8211; Social Networks<\/strong><\/p>\n\n\n\n<p><strong>Abstract:<\/strong> &nbsp;The Internet once represented an advancement in global democracy where the opportunity to share knowledge was no longer bound by the limitations of personal experience. Instead, we have been siloed into our preconceived biases and ideology while social network algorithms create echo chambers that can result in extremist behaviours. A consideration of social media\u2019s role and responsibility in the Christchurch massacre. <\/p>\n\n\n\n<p>The advances of Web 2.0 and the subsequent development of Social Network Sites (SNSs) such as Facebook, Twitter and Instagram had the potential to be an active tool in the advancement of global democracy and an opportunity for international unification. The new medium could play host to diverse ideas, and access would not be bound by the constraints of time, social status or geographic location. 
Politics has embraced SNSs, primarily as a communication tool through which everyday citizens are now able to locate information, receive and disseminate news, and quickly mobilise for a cause (Hyun and Kim, 2015). Despite their inclusive format, SNSs paradoxically do not lend themselves to democracy; instead they reinforce partisan politics and provide a breeding ground for extremism. In this conference paper, I argue that SNS platforms foster an environment of echo chambers that promote political and ideological polarisation, undermine democracy and facilitate extremism.<\/p>\n\n\n\n<p><strong>News Consumption and Dissemination<\/strong><\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>\u201cSocial Media do not show you the world out there, they construct a world to your liking, and as such, they are a breeding ground for echo chambers, and constructions of filter bubbles where all like-minded people get together and reinforce their perception of the realities and priorities rather than engaging with other views. And, everybody assumes this is the world out there!\u201d&nbsp; Dr Majid Khosravinik<\/p><\/blockquote>\n\n\n\n<p>Unlike other web-based platforms such as email and message boards, social media sites bring together the personal and the political (Yu, 2016). Interactions happen in real time, simulating face-to-face conversation within established relationships and networks. While most direct political discourse is initiated by individuals who are active participants in the political sphere, passive news consumption and purposeful dissemination of information are the more prominent political activities of most users on SNSs (Hyun and Kim, 2015). Political action is rarely an individual&#8217;s primary reason for using social media, but instead a by-product of interacting with current affairs and incidental opinions on the platforms. 
No longer limited to the role of consumer, users of SNSs are now able to create news and to share and link news items within their networks. This new form of mass media has usurped the role of gatekeeper from the traditional mainstream media and utilises its networking capabilities through established trust between participants (Hyun and Kim, 2015). <\/p>\n\n\n\n<p>Passive news consumption is initiated within social media feeds once a user consciously begins a relationship with a preferred news outlet or journalist by liking or following their page or profile, creating a continuous stream of news and updates, similar to the act of purchasing or consuming news from traditional outlets of personal choice. Purposeful dissemination of this news is akin to sharing something that you read in the paper or viewed on the nightly news bulletin (Hyun and Kim, 2015); however, rather than being limited to sharing news face to face with individuals, sharing on an SNS grants exposure to a much larger audience with the potential for viral publicity. This virality has previously worked for democracy in the case of the Arab Spring, fundraising in KONY 2012, education in #blacklivesmatter and awareness in the #metoo movement. However, on March 15, 2019, this virality was utilised to disseminate extreme right-wing propaganda via a live-streamed video of a massacre in Christchurch, New Zealand, accompanied by a manifesto of extremist ideology by an individual who was possibly radicalised by the very same platforms.<\/p>\n\n\n\n<p><strong>Algorithms, Echo Chambers and Filter Bubbles<\/strong><\/p>\n\n\n\n<p>Particularly evident within Facebook and YouTube, an individual\u2019s consumption and dissemination of news contributes data to their personal algorithm, which in turn anticipates what the user would like to see next. This algorithm continues to dictate what information is made available to the user to create the optimal experience each time they log in. 
Based on the data supplied, SNS algorithms not only curate pleasing content for the user but also regulate exposure to opposing views in newsfeeds, creating a custom experience free of differing ideology. This action mimics human behaviour, as individuals also tend to consume media that supports their preconceived belief system. Known as selective exposure or confirmation bias, studies of this phenomenon on social media have shown that \u201ccompared with algorithmic ranking, individuals\u2019 choices played a stronger role in limiting exposure to cross-cutting content\u201d (Bakshy, Messing &amp; Adamic, 2015) and that moderates were more likely to be susceptible to this type of behaviour (Spohr, 2017). One issue with selective exposure is that the individual is quite often unaware that this thought process is taking place, as it is a subconscious mechanism employed by humans to meet our need for consistency and as such cannot be easily altered. This trend is not limited to social media platforms; ideological selectivity also occurs with mainstream television, newspapers and blogs (Kim, 2011), where \u201cliberals and conservatives inhabit different worlds. There is little overlap in the news sources they turn to and trust\u201d (Mitchell, Matsa, Gottfried &amp; Kiley, 2014). Media availability is determined by economic viability: \u201cCompetition forces newspapers to cater to the prejudices of their readers, and greater competition typically results in more aggressive catering to such prejudices as competitors strive to divide the market\u201d (Mullainathan &amp; Shleifer, 2005). Further studies into selective exposure have shown that humans will also recall information that fits their preconceived beliefs and hypotheses with more accuracy and are often overconfident in their personal political understanding. 
<\/p>\n\n\n\n<p>When these predispositions and self-surety are combined with the filtering capacity of SNSs and the algorithms employed to personalise the user&#8217;s experience, it can create a potentially dangerous silo of polarisation that can undermine democracy (Spohr, 2017). Curation of the online experience has raised concerns that individuals now exist in echo chambers or \u201cfilter bubbles\u201d (Pariser, 2012), where users are only exposed to like-minded content that reinforces their existing position on political matters. These users are then able to form groups of like-minded individuals that collectively will not engage with opposing ideas, reinforcing preconceived beliefs en masse and creating political polarisation (Spohr, 2017). \u201cEcho chambers are not about new ideas or (critical) perspectives, they are about how well or effectively the group members reiterate the same idea\/belief\u201d (Khosravinik, 2017). The establishment of polarisation results in individuals no longer participating in the democratic process of informed voting but instead using their vote to affirm allegiance to a party or group. This can create an environment of othering, where individuals place more worth on their own group and regard people on the opposing side in an overly negative light. Research has shown that individuals within these homogenous groups may also adopt a more extreme stance within the group to advance their position (Spohr, 2017). If the individual&#8217;s views support notions associated with racism, nationalism or discriminatory ideals, this could also create an environment that nurtures violence, radicalism and extremism. The algorithms themselves do not necessarily point towards left or right ideology; however, in some cases, such as on YouTube, they will direct the user to progressively more extreme content. 
<br><\/p>\n\n\n\n<p><strong>Extreme Algorithms<\/strong><\/p>\n\n\n\n<p>Zeynep Tufekci, Associate Professor at the UNC School of Information and Library Science and contributor to the New York Times, <a href=\"https:\/\/www.nytimes.com\/2018\/03\/10\/opinion\/sunday\/youtube-politics-radical.html\">wrote about this phenomenon<\/a> after recognising changes in her feed while researching the 2016 presidential election. YouTube coverage of Donald Trump rallies would eventually lead to \u201cwhite supremacist rants, Holocaust denials and other disturbing content\u201d (Tufekci, 2018). Equally, creating another account that focused on Hillary Clinton and Bernie Sanders shaped an algorithm of left-wing extremism focused on conspiracy theories. While Tufekci had started on moderate ground, the algorithm employed by the Google giant to keep people online longer for monetary profit could push a slight leaning of preferences right into a rabbit hole of radicalism.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>&#8220;It seems as if you are never &#8220;hard core&#8221; enough for YouTube&#8217;s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalising instruments of the 21st century.&#8221; Zeynep Tufekci<\/p><\/blockquote>\n\n\n\n<p><strong>Fake News and News Finds Me<\/strong><\/p>\n\n\n\n<p>Consumption of curated news has little effect outside of the individual\u2019s personal \u201cfilter bubble\u201d; however, the dissemination of siloed material that is not subject to fact-checking can result in viral misinformation, or, as it is more commonly known, \u201cfake news\u201d. The term \u201cfake news\u201d has been overused in modern political discourse since the 2016 US presidential election. 
Its original use was as a descriptor for American late-night television shows such as <em>The Late Show with Stephen Colbert<\/em> and <em>Saturday Night Live<\/em>, which blur the lines between politics and comedy. Now the term is used habitually to oppose or silence differing political principles. Despite the term being popularised by Donald Trump during the 2016 presidential election when describing mainstream media outlets, research after the election indicated that fake news stories stemming from Facebook consistently received more engagement than top stories from mainstream media outlets. These \u201cfake news\u201d articles were ironically also more favourable towards Donald Trump, with \u201c115 pro-Trump fake articles being shared 30 million times compared to 41 pro-Clinton fake articles shared 7.6 million times\u201d (Spohr, 2017). Who should take responsibility for fake news is a point of contention, with some accountability being placed upon those that control the technology (Howard, 2016), while those that own the technology, for instance Mark Zuckerberg, CEO of Facebook, are <a href=\"https:\/\/edition.cnn.com\/2019\/03\/30\/tech\/facebook-mark-zuckerberg-regulation\/index.html\">requesting tighter government regulation of the internet in order to stem the flow of fake news<\/a> (Wattles &amp; O&#8217;Sullivan, 2019). While this to-and-fro debate continued around statistics and election results, no one could anticipate what was going to unfold in Christchurch and what role social networks may have played. <\/p>\n\n\n\n<p>Google, Facebook and Twitter were put on notice in 2018 by the European Commission, which requested the tech giants remove extremist content within one hour or face fines; however, what counts as extremism remains a point of contention. 
While terrorist groups such as ISIS have been all but removed from the platforms in a joint blocking effort, right-wing extremism has been slowly growing while executives and policymakers debate whether white supremacy and white nationalism are the same. Facebook, in particular, has now changed its stance, stating in a <a href=\"https:\/\/newsroom.fb.com\/news\/2019\/03\/standing-against-hate\/\">blog post<\/a> that &#8220;white nationalism and white separatism cannot be meaningfully separated from white supremacy and organized hate groups&#8221; (&#8220;Standing Against Hate | Facebook Newsroom&#8221;, 2019), and since Christchurch <a href=\"https:\/\/www.nbcnews.com\/tech\/tech-news\/facebook-bans-white-nationalism-after-pressure-civil-rights-groups-n987991\">has banned related content from the platform<\/a>, delivering a strong message to hate groups (Ingram &amp; Collins, 2019). While this decision has been met with outrage and branded as hindering free speech, it is important to remember that Facebook is a private entity that can set its own policies and user agreements. At the same time, however, Facebook Inc (including Instagram) is unique in that it has, to a degree, a monopoly of the market, and with that comes a responsibility to the public. As a publicly listed company, its primary responsibility is to its shareholders and to making money for its advertisers. Zuckerberg previously held the stance (as do most directors of SNS platforms) that social media sites are not responsible for user content because they are tech companies, not media companies. He has since backtracked on that position during a joint committee hearing of the United States Senate, conceding that Facebook Inc is responsible for the content and changing the dialogue around the role SNSs play in today&#8217;s political landscape. 
Despite these good intentions, <a href=\"https:\/\/motherboard.vice.com\/en_us\/article\/43jdbj\/christchurch-attack-videos-still-on-facebook-instagram\">Facebook was still hosting the Christchurch video a month later<\/a>, as its artificial intelligence struggles to keep up with individuals who modify and splice the content to avoid detection (Cox, 2019). <\/p>\n\n\n\n<p><strong>Who is in charge?<\/strong><br><\/p>\n\n\n\n<p>This month Sri Lanka was the victim of a series of terrorist attacks, claimed by ISIS and rumoured to be retaliation for the Christchurch massacre; <a href=\"https:\/\/www.socialmediatoday.com\/news\/sri-lankas-decision-to-block-social-media-highlights-rising-concerns-from\/553212\/\">the Sri Lankan government swiftly decided to block all social media<\/a> to avoid the spread of misinformation, because it had no faith in social networks\u2019 ability to control content on platforms such as Facebook, Instagram, YouTube, Snapchat, WhatsApp and Viber (Hutchinson, 2019). New Zealand Prime Minister Jacinda Ardern, who has stated that Facebook shares some responsibility for the video, has announced the \u201cChristchurch Call\u201d, an event scheduled for May at which global leaders have invited social media executives to plan how to \u201celiminate terrorist and violent extremist content on social media\u201d (&#8220;Christchurch terror attack video a type not seen before, says Facebook&#8221;, 2019). Australia swiftly passed the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill 2019 in response to the events in Christchurch, which could see tech executives serve jail time, while the UK has introduced the Online Harms White Paper, which aims to keep users in the UK safe from online extremism. 
A parliamentary hearing in the UK into hate crime last week <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2019-04-24\/facebook-says-first-person-christchurch-video-foiled-ai-system?utm_source=google&amp;utm_medium=bd&amp;cmpId=google\">observed Stephen Doughty MP admonish directors of Facebook, Twitter and Google<\/a>, stating \u201cYour systems are simply not working and quite frankly it\u2019s a cesspit. It feels like your companies don\u2019t give a damn. You give a lot of rhetoric, but you don\u2019t take action\u201d (Lanxon, 2019).<\/p>\n\n\n\n<p><strong>Where to next?<\/strong><\/p>\n\n\n\n<p>While government bodies can instigate change, and those that control SNSs should take greater responsibility for the accuracy of what is published on their platforms, individuals need to take responsibility for their own agency and be proactive about their choices in news consumption and distribution. Fact-checking, accessing more diverse content and being open to differing opinions rather than relying on the \u201calgorithms of their newsfeed and the ideological diversity, or the lack of such, of their social media network\u201d (Spohr, 2017) is a start towards reopening the lines of communication and raising the standard of what they are consuming. Studies also indicate that when an individual lowers their standard of what news they engage with, they are more susceptible to fake news shared by connections on social media (Spohr, 2017). Social media has created a state of apathy towards searching for news, as the accessibility of news feeds gives the illusion of being well-informed and up to date because the news comes to them. 
Unfortunately, fake news and a lack of diverse opinion are not limited to social media platforms, as mainstream media outlets have become more partisan in their approach to political reporting, motivated by the ratings that accompany polarisation.<\/p>\n\n\n\n<p>To lessen incidences of radicalisation and future acts of extremism, access to a diversity of voices and exposure to cross-cutting political ideals are urgently required within social media, and need to be facilitated by the platforms by altering the curated algorithms to include cross-partisan information. The current SNS business model that promotes popularity over fact requires action from the organisations and governing institutions, which is slowly being acknowledged. However, in the meantime this responsibility rests with the individual. It is no longer enough to believe you are being kept informed without proactively seeking news and, importantly, different viewpoints, in order to participate in a Web 2.0 democracy.<br><\/p>\n\n\n\n<p>References:<\/p>\n\n\n\n<p>Bakshy, E., Messing, S., &amp; Adamic, L. (2015). Exposure to ideologically diverse news and opinion on Facebook.&nbsp;<em>Science<\/em>,&nbsp;<em>348<\/em>(6239), 1130-1132. doi: 10.1126\/science.aaa1160<\/p>\n\n\n\n<p>Cox, J. (2019). 36 Days After Christchurch, Terrorist Attack Videos Are Still on Facebook. Retrieved from https:\/\/motherboard.vice.com\/en_us\/article\/43jdbj\/christchurch-attack-videos-still-on-facebook-instagram<\/p>\n\n\n\n<p>Howard, P. (2016). Is Social Media Killing Democracy? \u2014 Oxford Internet Institute. Retrieved from https:\/\/www.oii.ox.ac.uk\/blog\/is-social-media-killing-democracy\/<\/p>\n\n\n\n<p>Hutchinson, A. (2019). Sri Lanka&#8217;s Decision to Block Social Media Highlights Rising Concerns from Governments. 
Retrieved from\nhttps:\/\/www.socialmediatoday.com\/news\/sri-lankas-decision-to-block-social-media-highlights-rising-concerns-from\/553212\/<\/p>\n\n\n\n<p>Hyun,\nK. and Kim, J. (2015). Differential and interactive influences on political\nparticipation by different types of news activities and political conversation\nthrough social media.&nbsp;<em>Computers in Human Behavior<\/em>, 45, pp.328-334.<\/p>\n\n\n\n<p>Ingram,\nD., &amp; Collins, B. (2019). Facebook bans white nationalism from platform\nafter pressure from civil rights groups. Retrieved from https:\/\/www.nbcnews.com\/tech\/tech-news\/facebook-bans-white-nationalism-after-pressure-civil-rights-groups-n987991<\/p>\n\n\n\n<p>Khosravinik, M. (2017). Right Wing Populism in the West: Social Media\nDiscourse and Echo Chambers.&nbsp;<em>Insight Turkey<\/em>,&nbsp;<em>19<\/em>(3),\n53-68. doi: 10.25253\/99.2017193.04<\/p>\n\n\n\n<p>Kim, Y. (2011). The contribution of social network sites to exposure to\npolitical difference: The relationships among SNSs, online political messaging,\nand exposure to cross-cutting perspectives.&nbsp;<em>Computers In Human Behavior<\/em>,&nbsp;<em>27<\/em>(2),\n971-977. doi: 10.1016\/j.chb.2010.12.001<\/p>\n\n\n\n<p>Lanxon,\nN. (2019). Facebook Says First-Person Christchurch Video Foiled AI System.\nRetrieved from\nhttps:\/\/www.bloomberg.com\/news\/articles\/2019-04-24\/facebook-says-first-person-christchurch-video-foiled-ai-system?utm_source=google&amp;utm_medium=bd&amp;cmpId=google<\/p>\n\n\n\n<p>Mitchell,\nA., Matsa, K., Gottfried, J., &amp; Kiley, J. (2014). Political Polarization\n&amp; Media Habits. Retrieved from https:\/\/www.journalism.org\/2014\/10\/21\/political-polarization-media-habits\/<\/p>\n\n\n\n<p>Mullainathan,\nS., &amp; Shleifer, A. (2005). The Market for News.&nbsp;<em>The American\nEconomic Review,<\/em>&nbsp;<em>95<\/em>(4), 1031-1053. Retrieved from\nhttp:\/\/www.jstor.org\/stable\/4132704<\/p>\n\n\n\n<p> Nelson, J. L., &amp; Taneja, H. (2018). 
The small, disloyal fake news audience: The role of audience availability in fake news consumption. <em>New Media &amp; Society<\/em>, 20(10), 3720\u20133737. https:\/\/doi.org\/10.1177\/1461444818758715<\/p>\n\n\n\n<p>Pariser,\nE. (2012).&nbsp;<em>The filter bubble<\/em>. London: Viking.<\/p>\n\n\n\n<p>Spohr,\nD. (2017). Fake news and ideological polarization: Filter bubbles and selective\nexposure on social media.&nbsp;<em>Business Information Review<\/em>,&nbsp;<em>34<\/em>(3),\n150-160. doi: 10.1177\/0266382117722446<\/p>\n\n\n\n<p>Standing\nAgainst Hate | Facebook Newsroom. (2019). Retrieved from\nhttps:\/\/newsroom.fb.com\/news\/2019\/03\/standing-against-hate\/<\/p>\n\n\n\n<p>Tufekci,\nZ. (2018). Opinion | YouTube, the Great Radicalizer. Retrieved from\nhttps:\/\/www.nytimes.com\/2018\/03\/10\/opinion\/sunday\/youtube-politics-radical.html<\/p>\n\n\n\n<p>Wattles,\nJ., &amp; O&#8217;Sullivan, D. (2019). Facebook&#8217;s Mark Zuckerberg calls for more\nregulation of the internet. Retrieved from\nhttps:\/\/edition.cnn.com\/2019\/03\/30\/tech\/facebook-mark-zuckerberg-regulation\/index.html<\/p>\n\n\n\n<p>Yu,\nR. (2016). The relationship between passive and active non-political social\nmedia use and political expression on Facebook and Twitter.&nbsp;<em>Computers\nIn Human Behavior<\/em>,&nbsp;<em>58<\/em>, 413-420. doi: 10.1016\/j.chb.2016.01.019<\/p>\n\n\n\n<p> <br>This work is licensed under a&nbsp;<a href=\"http:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/\">Creative Commons Attribution-ShareAlike 4.0 International License<\/a>. <\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Internet once represented an advancement in global democracy where the opportunity to share knowledge was no longer bound by limitations of personal experience. Instead, we have been siloed into our preconceived biases and ideology while social network algorithms create echo chambers that can result in extremist behaviours. 
A consideration of social media\u2019s role and responsibility in the Christchurch massacre.<\/p>\n","protected":false},"author":14,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[34,114,111,113,117,8,41,118,119,112,7,21,116],"class_list":["post-203","post","type-post","status-publish","format-standard","hentry","category-social","tag-sns","tag-algorithms","tag-christchurch","tag-echo-chamber","tag-extremism","tag-facebook","tag-fake-news","tag-filter-bubbles","tag-google","tag-radicalisation","tag-social-media","tag-social-networks","tag-youtube"],"_links":{"self":[{"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/posts\/203","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/users\/14"}],"replies":[{"embeddable":true,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/comments?post=203"}],"version-history":[{"count":8,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/posts\/203\/revisions"}],"predecessor-version":[{"id":225,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/posts\/203\/revisions\/225"}],"wp:attachment":[{"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/media?parent=203"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/categories?post=203"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/networkconference.netstudies.org\/2019Open\/wp-json\/wp\/v2\/tags?post=203"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/
{rel}","templated":true}]}}