News media and public shared reality. Fake news, incomplete news, alternative facts, strategic inference, kompromat, agnotology, facebooking to a molecular level. Basic media literacy and whether it helps. As seen in elections, and provocateur twitter bots.
…newspapers such as the Scarfolk Mail realised that they no longer needed to provide actual content: Readers only saw what they wanted to see and comprehended what they wanted to comprehend.
“Data journalism” has interesting tools. But do people care about data? Are facts persuasive? As Gilad Lotan anecdotally illustrates, merely selecting facts can get you your own little reality, without even bothering to lie. As Gwern points out in Littlewood’s Law of Media, the anecdotes we can produce grow increasingly… odd.
[This] illustrates a version of Littlewood’s Law of Miracles: in a world with ~8 billion people, one which is increasingly networked and mobile and wealthy at that, a one-in-billion event will happen 8 times a month.
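The quoted figure can be sanity-checked with back-of-envelope arithmetic. One reading (my assumption, not necessarily Gwern's exact model) is that each person generates roughly one reportable "event" per month:

```python
# Back-of-envelope check of the quoted claim. Assumed model (mine):
# each person generates roughly one reportable "event" per month,
# and we ask how often a one-in-a-billion event surfaces.
population = 8_000_000_000            # ~8 billion people
events_per_person_month = 1           # hypothetical reporting rate
one_in_a_billion = 1_000_000_000

expected_per_month = population * events_per_person_month / one_in_a_billion
print(expected_per_month)             # 8.0
```

Change the assumed reporting rate and the count scales linearly, which is the unsettling part: the better networked and better documented the world gets, the more one-in-a-billion stories the media can truthfully serve up.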
Human extremes are not only weirder than we suppose, they are weirder than we can suppose.
But let’s, for a moment, assume that people actually intend to come to a shared understanding of reality, writ large and systemic. Do they even have the skills? I don’t know, but it is hard to work out when you are being fed bullshit, and we don’t do well at teaching that. There are courses on identifying the lazier types of bullshit.
Will all the billions of humans on earth take such a course? Would they deploy the skills they learned thereby even if they did?
And, given that society is complex and counterintuitive even when we are doing simple analysis of correlation, how about more complex causation, such as feedback loops? Nicky Case has a diagrammatic account of how “systems journalism” might work.
Let’s get real here; for the moment, reasoned engagement with a shared rational enlightenment doesn’t dominate the media. Bread and circuses and kompromat and gut-instinct do.
Seeking news from traditional sources—newspapers and magazines—has been replaced with a new model: getting all of one’s news from trending stories on social networks. The people that we know best are most likely to influence us because we trust them. Their ideas and beliefs shape ours. And the tech behind social networks is built to enhance this …
Once a user joins a single group on Facebook, the social network will suggest dozens of others on that topic, as well as groups focused on tangential topics that people with similar profiles also joined. That is smart business. However, with unchecked content, it means that once people join a single conspiracy-minded group, they are algorithmically routed to a plethora of others. Join an anti-vaccine group, and your suggestions will include anti-GMO, chemtrail watch, flat Earther (yes, really), and “curing cancer naturally” groups. Rather than pulling a user out of the rabbit hole, the recommendation engine pushes them further in. We are long past merely partisan filter bubbles and well into the realm of siloed communities that experience their own reality and operate with their own facts.
See also her posts on the digital Maginot Line.
Kate Starbird’s study of agents provocateurs in online campaigns.
For a false binary, try Nick Cohen, Trump’s lies are not the problem. It’s the millions who swallow them who really matter:
Compulsive believers are not just rednecks. They include figures as elevated as the British prime minister and her cabinet. …
Mainstream journalists are almost as credulous. After decades of imitating Jeremy Paxman and seizing on the trivial gaffes and small lies of largely harmless politicians, they are unable to cope with the fantastic lies of the new authoritarian movements. When confronted with men who lie so instinctively they believe their lies as they tell them, they can only insist on a fair hearing for the sake of “balance”. Their acceptance signals to the audience the unbelievable is worthy of belief.
This is a shallow causal analysis; however, thinking about the credulity of people in power is interesting.
- The ComProp Navigator: “an online resource guide for civil society groups looking to better deal with the problem of disinformation. Let us know your concerns and we will suggest resources, curated by civil society practitioners and the Project on Computational Propaganda.”
- Full Fact is a full-time fact-checking agency in the UK, producing reports such as Tackling misinformation in an Open Society
- I found the previous organisation via the Data Skeptic podcast’s Fake News series
- An amusing portrait of snopes
- How much of the internet is fake?
Unfiltered news doesn’t share well at all:
- It can be emotional, but in the worst sense; few are willing to spread a gruesome account from Mosul among their peers.
- Most likely, unfiltered news will convey a negative aspect of society. Again, another revelation from The Intercept or ProPublica won’t get many clicks.
- Unfiltered news can upset users’ views, beliefs, or opinions.
Tim Harford, The Problem With Facts:
… will this sudden focus on facts actually lead to a more informed electorate, better decisions, a renewed respect for the truth? The history of tobacco suggests not. The link between cigarettes and cancer was supported by the world’s leading medical scientists and, in 1964, the US surgeon general himself. The story was covered by well-trained journalists committed to the values of objectivity. Yet the tobacco lobbyists ran rings round them.
In the 1950s and 1960s, journalists had an excuse for their stumbles: the tobacco industry’s tactics were clever, complex and new. First, the industry appeared to engage, promising high-quality research into the issue. The public were assured that the best people were on the case. The second stage was to complicate the question and sow doubt: lung cancer might have any number of causes, after all. And wasn’t lung cancer, not cigarettes, what really mattered? Stage three was to undermine serious research and expertise. Autopsy reports would be dismissed as anecdotal, epidemiological work as merely statistical, and animal studies as irrelevant. Finally came normalisation: the industry would point out that the tobacco-cancer story was stale news. Couldn’t journalists find something new and interesting to say?
… In 1995, Robert Proctor, a historian at Stanford University who has studied the tobacco case closely, coined the word “agnotology”. This is the study of how ignorance is deliberately produced; the entire field was started by Proctor’s observation of the tobacco industry. The facts about smoking — indisputable facts, from unquestionable sources — did not carry the day. The indisputable facts were disputed. The unquestionable sources were questioned. Facts, it turns out, are important, but facts are not enough to win this kind of argument.
A few years ago, a cynical commentator described the “dead cat” strategy, to be deployed when losing an argument at a dinner party: throw a dead cat on the table. The awkward argument will instantly cease, and everyone will start losing their minds about the cat. The cynic’s name was Boris Johnson.
The tactic worked perfectly in the Brexit referendum campaign. Instead of a discussion of the merits and disadvantages of EU membership, we had a frenzied dead-cat debate over the true scale of EU membership fees.
For more hot tips like that, try The Alt-Right Playbook.
The most comprehensive study to date of the growing universe of partisan websites and Facebook pages about US politics reveals that in 2016 alone at least 187 new websites launched, and that the candidacy and election of Donald Trump has unleashed a golden age of aggressive, divisive political content that reaches a massive amount of people on Facebook.
Thanks to a trinity of the internet, Facebook, and online advertising, partisan news websites and their associated Facebook pages are almost certainly making more money for more people and reaching more Americans than at any time in history. In some cases, publishers are generating hundreds of thousands of dollars a month in revenue, with small operations easily earning five figures thanks to one website and at least one associated Facebook page.
At its root, the analysis of 667 websites and 452 associated Facebook pages reveals the extent to which American online political discourse is powered by a mix of money and outrage.
(Goel, Mason, and Watts 2010):
It is often asserted that friends and acquaintances have more similar beliefs and attitudes than do strangers; yet empirical studies disagree over exactly how much diversity of opinion exists within local social networks and, relatedly, how much awareness individuals have of their neighbors’ views. This article reports results from a network survey, conducted on the Facebook social networking platform, in which participants were asked about their own political attitudes, as well as their beliefs about their friends’ attitudes. Although considerable attitude similarity exists among friends, the results show that friends disagree more than they think they do. In particular, friends are typically unaware of their disagreements, even when they say they discuss the topic, suggesting that discussion is not the primary means by which friends infer each other’s views on particular issues. Rather, it appears that respondents infer opinions in part by relying on stereotypes of their friends and in part by projecting their own views. The resulting gap between real and perceived agreement may have implications for the dynamics of political polarization and theories of social influence in general.
(Watts and Dodds 2007):
A central idea in marketing and diffusion research is that influentials— a minority of individuals who influence an exceptional number of their peers— are important to the formation of public opinion. Here we examine this idea, which we call the “influentials hypothesis,” using a series of computer simulations of interpersonal influence processes. Under most conditions that we consider, we find that large cascades of influence are driven not by influentials but by a critical mass of easily influenced individuals. Although our results do not exclude the possibility that influentials can be important, they suggest that the influentials hypothesis requires more careful specification and testing than it has received.
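The flavour of simulation behind this result can be sketched in a few lines: a threshold-cascade model in the spirit of Watts and Dodds, where a fraction of "easily influenced" nodes (low adoption threshold) determines whether a single seed ignites a large cascade. All parameter values below are illustrative, not taken from the paper:

```python
import random

def cascade(n=1000, k=6, easy_frac=0.3, rng_seed=1):
    """Threshold-cascade sketch in the spirit of Watts & Dodds (2007).
    A fraction `easy_frac` of nodes are 'easily influenced' (low
    adoption threshold); the rest are stubborn. Returns the number of
    nodes activated by a single ordinary seed."""
    rng = random.Random(rng_seed)
    # crude random graph: each node wires itself to k random others
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for j in rng.sample(range(n), k):
            if j != i:
                nbrs[i].add(j)
                nbrs[j].add(i)
    # heterogeneous thresholds: easily influenced vs stubborn
    thresh = [0.05 if rng.random() < easy_frac else 0.5 for _ in range(n)]
    active = {0}                       # a single, unremarkable seed
    changed = True
    while changed:                     # iterate to a fixed point
        changed = False
        for i in range(n):
            if i in active:
                continue
            frac = sum(nb in active for nb in nbrs[i]) / len(nbrs[i])
            if frac >= thresh[i]:      # enough neighbours adopted
                active.add(i)
                changed = True
    return len(active)
```

With `easy_frac=0` the cascade dies at the seed; raise it past a critical mass and the same unremarkable seed triggers a large cascade. Swapping the seed for a high-degree "influential" changes little, which is roughly the paper's point: cascade size is governed by the supply of easily influenced nodes, not by who starts it.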
(Martin and Yurukoglu 2017):
On the right, audiences concentrate attention on purely right wing outlets. On the left and center audiences spread their attention broadly and focus on mainstream organizations. This asymmetric pattern holds for the linking practices of media producers. Both supply and demand on the right are insular and self-focused. On the left and center they are spread broadly and anchored by professional press.
These differences create a different dynamic for media, audiences, and politicians on the left and right.
We all like to hear news that confirms our beliefs and identity. On the left, outlets and politicians try to attract readers by telling such stories but are constrained because their readers are exposed to a range of outlets, many of which operate with strong fact-checking norms.
On the right, because audiences do not trust or pay attention to outlets outside their own ecosystem, there is no reality check to constrain competition. Outlets compete on political purity and stoking identity-confirming narratives. Outlets and politicians who resist the flow by focusing on facts are abandoned or vilified by audiences and competing outlets. This forces media and political elites to validate and legitimate the falsehoods, at least through silence, creating a propaganda feedback loop. …
The highly asymmetric pattern of media ecosystems we observe on the right and the left, despite the fact that Facebook and Twitter usage is roughly similar on both sides, requires that we look elsewhere for what is causing the difference.
Surveys make it clear that Fox News is by far the most influential outlet on the American right — more than five times as many Trump supporters reported using Fox News as their primary news outlet than those who named Facebook. And Trump support was highest among demographics whose social media use was lowest.
Our data repeatedly show Fox as the transmission vector of widespread conspiracy theories.
Farrell (n.d.) claims Fox News moved the 2008 presidential election Republican vote share by 6.3% to the right. (I have not read this article yet.)
In other results, we estimate that removing Fox News from cable television during the 2000 election cycle would have reduced the overall Republican presidential vote share by 0.46 percentage points. The predicted effect increases in 2004 and 2008 to 3.59 and 6.34 percentage points, respectively. This increase is driven by increasing viewership on Fox News as well as an increasingly conservative slant.
Witness an incredible transformation! In less than two years an entire wing of the Republican Party metamorphosed from a collection of free trade zealots into a cartel of pro-tariff troglodytes! Why did this happen? Did the Trump administration dismantle a national system of spies, censors, and political prisoners? Were the Republican masses really anti-trade all along, but afraid to proclaim their opinion until their man was in power?
Political convictions do not work the way most people think they do.
“It is difficult to get a man to understand something,” quipped Upton Sinclair, “when his salary depends upon his not understanding it!”  It is easy to see the truth behind this statement. But also easy to see are its limitations. Very few Americans have a direct financial stake in same-sex marriage, nuclear power, the size of the American military, or a balanced budget. The number of Americans whose salary is dependent on a particular understanding of affirmative action or global warming is very small. Even when an issue of public importance has a direct and undeniable impact on household finances (say tax rates or health care), members of that household will often vote against the party whose policies would save them the most money. If asked to guess an American’s opinion on tax cuts, you would be better off knowing whether that person is pro-life or pro-choice than knowing their income.
The study identifies three essential roles for Twitter users. Partisan users both consume and produce content with only a one-sided leaning and enjoy high appreciation measured by both network centrality and content endorsement. Gatekeepers have a central role in the formation of echo chambers because they consume content with diverse leanings but choose to produce only content with a one-sided leaning. Bipartisan users produce content with both leanings and make an effort to bridge the echo chambers, but they are less valued in their networks than their partisan counterparts.
A methodology that clusters users into three discrete groups smells funky, but I really need to read the paper to see what they actually did.
So let’s say you’re a media owner who’s “all business,” caring only for the bottom line, looking to keep shareholders and owners happy, buy yourself some expensive houses, and/or get yourself or your friends re-elected. Without a doubt, the journalists who are prepared to tell readers the truth—even about you and your friends!—can only be a hindrance, and are best shut up or rid of, to the extent possible.
Aistrope, Tim. 2016. “Social Media and Counterterrorism Strategy.” Australian Journal of International Affairs 70 (2): 121–38. https://doi.org/10.1080/10357718.2015.1113230.
Arif, Ahmer, Leo Graiden Stewart, and Kate Starbird. 2018. “Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proc. ACM Hum.-Comput. Interact. 2 (CSCW): 20:1–20:27. https://doi.org/10.1145/3274289.
Benkler, Yochai, Rob Faris, and Harold Roberts. 2018. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. New York, NY: Oxford University Press. http://fdslive.oup.com/www.oup.com/academic/pdf/openaccess/9780190923624.pdf.
Bessi, Alessandro. 2016. “On the Statistical Properties of Viral Misinformation in Online Social Media,” September. http://arxiv.org/abs/1609.09435.
Bradshaw, S., and P. Howard. 2017. “Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation” 2017.12. https://ora.ox.ac.uk/objects/uuid:cef7e8d9-27bf-4ea5-9fd6-855209b3e1f6.
Brito, Kellyton, Natalia Paula, Manoel Fernandes, and Silvio Meira. 2019. “Social Media and Presidential Campaigns – Preliminary Results of the 2018 Brazilian Presidential Election.” In Proceedings of the 20th Annual International Conference on Digital Government Research, 332–41. Dg.o 2019. Dubai, United Arab Emirates: ACM. https://doi.org/10.1145/3325112.3325252.
Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84. https://doi.org/10.2105/AJPH.2018.304567.
Bursztyn, Victor S, and Larry Birnbaum. 2019. “Thousands of Small, Constant Rallies: A Large-Scale Analysis of Partisan WhatsApp Groups,” 6.
Cadwalladr, Carole. 2017. “The Great British Brexit Robbery: How Our Democracy Was Hijacked.” The Guardian: Technology, May 7, 2017. https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy.
Cha, Meeyoung, Hamed Haddadi, Fabricio Benevenuto, and Krishna P. Gummadi. 2010. “Measuring User Influence in Twitter: The Million Follower Fallacy.” In Fourth International AAAI Conference on Weblogs and Social Media. https://www.aaai.org/ocs/index.php/ICWSM/ICWSM10/paper/view/1538.
Coscia, Michele. 2017. “Popularity Spikes Hurt Future Chances for Viral Propagation of Protomemes.” Communications of the ACM 61 (1): 70–77. https://doi.org/10.1145/3158227.
Crooks, Andrew. n.d. “Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Accessed August 15, 2019. https://www.gisagents.org/2019/08/bot-stamina-examining-influence-and.html.
Dittmar, Jeremiah E., and Skipper Seabold. 2015. “Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation.” Centre for Economic Performance Working Paper. http://www.jeremiahdittmar.com/files/dittmar_seabold_print_religion.pdf.
Evans, David S. 2017. “The Economics of Attention Markets.” SSRN Scholarly Paper ID 3044858. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3044858.
Farrell, Henry. n.d. “Blame Fox, Not Facebook, for Fake News.” Washington Post: Monkey Cage. Accessed September 16, 2019. https://www.washingtonpost.com/news/monkey-cage/wp/2018/11/06/blame-fox-not-facebook-for-fake-news/.
Farrell, Henry, and Bruce Schneier. 2018. “Common-Knowledge Attacks on Democracy.” SSRN Scholarly Paper ID 3273111. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3273111.
Garimella, Kiran, Gianmarco De Francisci Morales, Aristides Gionis, and Michael Mathioudakis. 2018. “Political Discourse on Social Media: Echo Chambers, Gatekeepers, and the Price of Bipartisanship,” February. http://arxiv.org/abs/1801.01665.
Gelman, Andrew. 2007. “Struggles with Survey Weighting and Regression Modeling.” Statistical Science 22 (2): 153–64. https://doi.org/10.1214/088342306000000691.
Goel, Sharad, Ashton Anderson, Jake Hofman, and Duncan J. Watts. 2015. “The Structural Virality of Online Diffusion.” Management Science, July, 150722112809007. https://doi.org/10.1287/mnsc.2015.2158.
Goel, Sharad, Winter Mason, and Duncan J. Watts. 2010. “Real and Perceived Attitude Agreement in Social Networks.” Journal of Personality and Social Psychology 99 (4): 611–21. https://doi.org/10.1037/a0020697.
Goel, Sharad, Duncan J. Watts, and Daniel G. Goldstein. 2012. “The Structure of Online Diffusion Networks.” In Proceedings of the 13th ACM Conference on Electronic Commerce - EC ’12, 623. Valencia, Spain: ACM Press. https://doi.org/10.1145/2229012.2229058.
Gonzalez-Bailon, Sandra. 2009. “Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks 31 (4): 271–80. https://doi.org/10.1016/j.socnet.2009.07.003.
Granovetter, Mark. 1983. “The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory 1 (1): 201–33. http://www.soc.ucsb.edu/faculty/friedkin/Syllabi/Soc148/Granovetter%201983.pdf.
Granovetter, Mark S. 1973. “The Strength of Weak Ties.” The American Journal of Sociology 78 (6): 1360–80. https://doi.org/10.2307/2776392.
Howard, Philip N., and Bence Kollanyi. 2016. “Bots, #StrongerIn, and #Brexit: Computational Propaganda During the UK-EU Referendum.” Browser Download This Paper. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2798311.
Jackson, Matthew O. 2018. “The Friendship Paradox and Systematic Biases in Perceptions and Social Norms.” Journal of Political Economy 127 (2): 777–818. https://doi.org/10.1086/701031.
Johnson, Hollyn M., and Colleen M. Seifert. 1994. “Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Learning, Memory 20 (6): 1420–36. https://www.researchgate.net/profile/Colleen_Seifert/publication/232501255_Sources_of_the_Continued_Influence_Effect_When_Misinformation_in_Memory_Affects_Later_Inferences/links/5485d4070cf268d28f0045c0.pdf.
Kempe, David, Jon Kleinberg, and Éva Tardos. 2003. “Maximizing the Spread of Influence Through a Social Network.” In Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 137–46. KDD ’03. Washington, D.C.: ACM. https://doi.org/10.1145/956750.956769.
Klausen, Jytte. 2015. “Tweeting the Jihad: Social Media Networks of Western Foreign Fighters in Syria and Iraq.” Studies in Conflict & Terrorism 38 (1): 1–22. https://doi.org/10.1080/1057610X.2014.974948.
Lee, Eun, Fariba Karimi, Hang-Hyun Jo, Markus Strohmaier, and Claudia Wagner. 2017. “Homophily Explains Perception Biases in Social Networks.” arXiv:1710.08601 [Physics], October. http://arxiv.org/abs/1710.08601.
Lerman, Kristina, Xiaoran Yan, and Xin-Zeng Wu. 2016. “The "Majority Illusion" in Social Networks.” PLOS ONE 11 (2): e0147617. https://doi.org/10.1371/journal.pone.0147617.
Levy, Gilat, and Ronny Razin. 2019. “Echo Chambers and Their Effects on Economic and Political Outcomes.” Annual Review of Economics 11 (1): 303–28. https://doi.org/10.1146/annurev-economics-080218-030343.
Lewis, Rebecca. n.d. “Broadcasting the Reactionary Right on YouTube,” 61.
Lin, Herbert, and Jaclyn Kerr. 2019. “On Cyber-Enabled Information Warfare and Information Operations.” SSRN Scholarly Paper ID 3015680. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3015680.
Machado, Caio, Beatriz Kira, Vidya Narayanan, Bence Kollanyi, and Philip Howard. 2019. “A Study of Misinformation in WhatsApp Groups with a Focus on the Brazilian Presidential Elections.” In Companion Proceedings of the 2019 World Wide Web Conference, 1013–9. WWW ’19. San Francisco, USA: ACM. https://doi.org/10.1145/3308560.3316738.
Martin, Gregory J., and Ali Yurukoglu. 2017. “Bias in Cable News: Persuasion and Polarization.” American Economic Review 107 (9): 2565–99. https://doi.org/10.1257/aer.20160812.
Marwick, Alice, and Rebecca Lewis. 2017. “Media Manipulation and Disinformation Online.” Data & Society Research Institute. https://datasociety.net/output/media-manipulation-and-disinfo-online/.
Munn, Luke. 2019. “Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday 24 (6). https://doi.org/10.5210/fm.v24i6.10108.
Oliver, Eric, and Tom Wood. 2014. “Larger Than Life.” New Scientist 224 (3000): 36–37. https://doi.org/10.1016/S0262-4079(14)62441-6.
Oliver, J. Eric, and Thomas J. Wood. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66. https://doi.org/10.1111/ajps.12084.
Olteanu, Alexandra, Carlos Castillo, Fernando Diaz, and Emre Kıcıman. 2019. “Social Data: Biases, Methodological Pitfalls, and Ethical Boundaries.” Frontiers in Big Data 2. https://doi.org/10.3389/fdata.2019.00013.
Redlawsk, David P., Andrew J. W. Civettini, and Karen M. Emmerson. 2010. “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?” Political Psychology 31 (4): 563–93. https://doi.org/10.1111/j.1467-9221.2010.00772.x.
Ribeiro, Manoel Horta, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira. 2019. “Auditing Radicalization Pathways on YouTube,” August. http://arxiv.org/abs/1908.08313.
Rieder, Bernhard, Ariadna Matamoros-Fernández, and Òscar Coromina. 2018. “From Ranking Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube Search Results.” Convergence 24 (1): 50–68. https://doi.org/10.1177/1354856517736982.
Salganik, Matthew J., and Duncan J. Watts. 2008. “Leading the Herd Astray: An Experimental Study of Self-Fulfilling Prophecies in an Artificial Cultural Market.” Social Psychology Quarterly 74 (4): 338. https://doi.org/10.1177/019027250807100404.
Schuchard, Ross, Andrew T. Crooks, Anthony Stefanidis, and Arie Croitoru. 2019. “Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Applied Network Science 4 (1): 1–23. https://doi.org/10.1007/s41109-019-0164-x.
Shakya, Holly B., and Nicholas A. Christakis. 2017. “Association of Facebook Use with Compromised Well-Being: A Longitudinal Study.” American Journal of Epidemiology 185 (3). https://doi.org/10.1093/aje/kww189.
Softky, William, and Criscillia Benford. 2017. “Sensory Metrics of Neuromechanical Trust.” Neural Computation 29 (9): 2293–2351. https://doi.org/10.1162/neco_a_00988.
Staines, Cassie, and Will Moy. 2018. “Tackling Misinformation in an Open Society.” https://fullfact.org/media/uploads/full_fact_tackling_misinformation_in_an_open_society.pdf.
Starbird, Kate. 2017. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In Eleventh International AAAI Conference on Web and Social Media. https://www.aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15603.
———. 2019. “Disinformation’s Spread: Bots, Trolls and All of Us.” Nature 571 (July): 449. https://doi.org/10.1038/d41586-019-02235-x.
Stewart, Leo G, Ahmer Arif, and Kate Starbird. 2018. “Examining Trolls and Polarization with a Retweet Network,” 6.
Stewart, Leo Graiden, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. “Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 1–23. https://doi.org/10.1145/3134920.
Valente, Thomas W., and Stephanie R. Pitts. 2017. “An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.” Annual Review of Public Health 38 (1): 103–18. https://doi.org/10.1146/annurev-publhealth-031816-044528.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359 (6380): 1146–51. https://doi.org/10.1126/science.aap9559.
Watts, Duncan J., and Peter Sheridan Dodds. 2007. “Influentials, Networks, and Public Opinion Formation.” Journal of Consumer Research 34 (4): 441–58. https://doi.org/10.1086/518527.
West, Jevin D., and Carl T. Bergstrom. 2011. “Can Ignorance Promote Democracy?” Science 334 (6062): 1503–4. https://doi.org/10.1126/science.1216124.
Wilson, Tom, Kaitlyn Zhou, and Kate Starbird. 2018. “Assembling Strategic Narratives: Information Operations as Collaborative Work Within an Online Community.” Proc. ACM Hum.-Comput. Interact. 2 (CSCW): 183:1–183:26. https://doi.org/10.1145/3274452.
Winter, Aaron. 2019. “Online Hate: From the Far-Right to the ‘Alt-Right’ and from the Margins to the Mainstream.” In Online Othering: Exploring Digital Violence and Discrimination on the Web, edited by Karen Lumsden and Emily Harmer, 39–63. Palgrave Studies in Cybercrime and Cybersecurity. Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-12633-9_2.