Misinformation, disinformation, etc.

Trolls, bots, lulz, infowars and other moods of the modern networked ape

October 21, 2019 — October 6, 2024

adversarial
catastrophe
computers are awful together
confidentiality
democracy
economics
evolution
game theory
incentive mechanisms
insurgency
networks
P2P
social graph
virality
wonk

Content warning:

Discussion of terrorism and hate speech


The notebook formerly known as “Weaponized social media”.

I don’t think that name was great, but I am not a fan of the “misinformation/disinformation” naming either; it’s just the one that everyone has decided to use in this most accursed timeline.

Although I am comfortable arguing that, empirically, weaponization happens and disinformation happens, analysis that starts from these terms seems to go off the rails relatively quickly. The more productive question must be something more like “how can we collectively converge on truth?” (as opposed to, for example, “how can we defeat the enemy?”). Questions that start from accusations lead to self-defeating disputes. I’ve seen things, having formerly been on Twitter.

The big-picture problem we want to solve here is mechanism design for truth-finding in adversarial, attention-constrained environments. And we want to do so while being prepared to have our methods and questions be instantly weaponized the moment we bring them to bear.

Large malevolent forces are active online, of course. But our collective informational cooperation might be able to deal with that, if we could take enough time to work together on that problem rather than accusing each other of being shills for someone else.

This notebook has lost cohesion, turning into a link salad. It would be more valuable if it analysed and compared some theses; in this topic area we need fewer listicles and more synthesis. Until I fix that, I am part of the problem.

Candidate for merging with the spamularity notebook.

Memetic information warfare on the social information graph, using viral media for the purpose of human behaviour control and steering epistemic communities. The other side of trusted news: hacking the implicit reputation system of social media to suborn factual reporting, or to motivate people to behave to suit your goals, for example to sell uncertainty.

News media and public shared reality. Fake news, incomplete news, alternative facts, strategic inference, general incompetence. Kompromat, agnotology, facebooking to a molecular level. Basic media literacy and whether it helps. As seen in elections, and in provocateur Twitter bots.

Research in this area is vague for many reasons. It is hard to do experiments on societies at large, for reasons of ethics and practicality, at least for most of us. Also, our tools for causal inference on social graphs are weak, and the problem is hard. There are some actors (nation states, social media companies) for whom experiments are practical, but they have various reasons for not sharing their results. No reason not to try to understand the state of things, though.

2 Fact checking

As Gwern points out, Littlewood’s Law of Media implies that the anecdotes we can recount, in all truthfulness, grow increasingly weird as the population does. In a large enough sample you can find a small number of occurrences to support any hypothesis you would like.

[This] illustrates a version of Littlewood’s Law of Miracles: in a world with ~8 billion people, one which is increasingly networked and mobile and wealthy at that, a one-in-billion event will happen 8 times a month.

Human extremes are not only weirder than we suppose, they are weirder than we can suppose.
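
The arithmetic behind that claim is worth spelling out. Here is a minimal sketch, assuming (my reading of the quote) that each person gets roughly one opportunity per month to experience a given one-in-a-billion event:

    from math import exp

    population = 8_000_000_000   # ~8 billion people
    p_event = 1e-9               # a "one-in-a-billion" event, per person per month (assumed framing)

    # Expected number of such events per month across the whole population.
    expected_per_month = population * p_event
    print(expected_per_month)    # 8.0, the "8 times a month" in the quote

    # Poisson approximation: the chance that at least one such event happens
    # somewhere this month is effectively certain.
    print(1 - exp(-expected_per_month))  # ~0.99966

At that scale, astonishing anecdotes are a guaranteed by-product of population size, which is exactly why a handful of vivid examples can be found for any hypothesis.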

But let’s, for a moment, assume that people actually intend to come to a shared understanding of the facts of reality, writ large and systemic. Do they even have the skills? I don’t know, but it is hard to work out when you are being fed bullshit, and we do not do well at teaching that. There are courses on identifying the lazier type of bullshit

and even courses on more sophisticated bullshit detection:

Craig Silverman (ed), Verification Handbook For Disinformation And Media Manipulation.

Will all the billions of humans on earth take such a course? Would they deploy the skills they learned thereby even if they did?

And given that society is complex and counter-intuitive even when we stick to simple analyses of correlation, how do we handle more complex causation, such as feedback loops? Nicky Case created a diagrammatic account of how “systems journalism” might work.

Tim Harford, The Problem With Facts:

[…] will this sudden focus on facts actually lead to a more informed electorate, better decisions, a renewed respect for the truth? The history of tobacco suggests not. The link between cigarettes and cancer was supported by the world’s leading medical scientists and, in 1964, the US surgeon general himself. The story was covered by well-trained journalists committed to the values of objectivity. Yet the tobacco lobbyists ran rings round them.

In the 1950s and 1960s, journalists had an excuse for their stumbles: the tobacco industry’s tactics were clever, complex and new. First, the industry appeared to engage, promising high-quality research into the issue. The public were assured that the best people were on the case. The second stage was to complicate the question and sow doubt: lung cancer might have any number of causes, after all. And wasn’t lung cancer, not cigarettes, what really mattered? Stage three was to undermine serious research and expertise. Autopsy reports would be dismissed as anecdotal, epidemiological work as merely statistical, and animal studies as irrelevant. Finally came normalisation: the industry would point out that the tobacco-cancer story was stale news. Couldn’t journalists find something new and interesting to say?

[…] In 1995, Robert Proctor, a historian at Stanford University who has studied the tobacco case closely, coined the word “agnotology”. This is the study of how ignorance is deliberately produced; the entire field was started by Proctor’s observation of the tobacco industry. The facts about smoking — indisputable facts, from unquestionable sources — did not carry the day. The indisputable facts were disputed. The unquestionable sources were questioned. Facts, it turns out, are important, but facts are not enough to win this kind of argument.

3 Conspiracy theories and their uses

See Conspiracy mania.

3.1 Partisanship in social media firms

That WSJ article about how Facebook wants to cause strife.

Allegations that Instagram’s Hashtag Blocking Favors Trump, Hurts Biden.

Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment

“I do think we’re headed for a problematic scenario where Facebook is going to be used to aggressively undermine the legitimacy of the US elections, in a way that has never been possible in history,” one Facebook employee wrote in a group on Workplace, the company’s internal communication platform, earlier this week.

There are allegations about broken processes inside Facebook leading to complicity in interfering with politics.

Recent examples of this include:

Craig Silverman et al.’s “I Have Blood on My Hands”: A Whistleblower Says Facebook Ignored Global Political Manipulation covers allegations by Sophie Zhang that Facebook is, strategically or through incompetence, failing to police “fake news” and especially astroturfing uses of its platform.

Hurting People At Scale

The documents — which include company discussion threads, employee survey results, and recordings of Zuckerberg — reveal that the company was slow to take down ads with white nationalist and Nazi content reported by its own employees. They demonstrate how the company’s public declarations about supporting racial justice causes are at odds with policies forbidding Facebookers from using company resources to support political matters. They show Zuckerberg being publicly accused of misleading his employees. Above all, they portray a fracturing company culture.

Frustrated and angry, employees are now challenging Zuckerberg and leadership at companywide meetings, staging virtual walkouts, and questioning if their work is making the world a better place. The turmoil has reached a point where Facebook’s CEO recently threatened to fire employees who “bully” their colleagues.

As it heads into a US presidential election where its every move will be dissected and analyzed, the social network is facing unprecedented internal dissent as employees worry that the company is wittingly or unwittingly exerting political influence on content decisions related to Trump, and fear that Facebook is undermining democracy.

Eugene Wei, TikTok and the Sorting Hat

To help a network break out from its early adopter group, you need both to bring lots of new people/subcultures into the app—that’s where the massive marketing spend helps—but also ways to help these disparate groups to 1) find each other quickly and 2) branch off into their own spaces.

More than any other feed algorithm I can recall, Bytedance’s short video algorithm fulfilled these two requirements. It is a rapid, hyper-efficient matchmaker. Merely by watching some videos, and without having to follow or friend anyone, you can quickly train TikTok on what you like. In the two sided entertainment network that is TikTok, the algorithm acts as a rapid, efficient market maker, connecting videos with the audiences they’re destined to delight. The algorithm allows this to happen without an explicit follower graph.

Just as importantly, by personalizing everyone’s FYP feeds, TikTok helped to keep these distinct subcultures, with their different tastes, separated. One person’s cringe is another person’s pleasure, but figuring out which is which is no small feat.

TikTok’s algorithm is the Sorting Hat from the Harry Potter universe.
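
To make Wei’s “market maker” image concrete, here is a toy sketch of follower-graph-free matching: score candidate videos for a user purely from implicit watch feedback. The embeddings, dimensions and scoring rule are all invented for illustration; this is not a description of Bytedance’s actual system.

    import numpy as np

    rng = np.random.default_rng(0)
    n_videos, dim = 10_000, 32

    # Pretend each video already has an embedding learned from engagement data.
    video_emb = rng.normal(size=(n_videos, dim))

    def user_embedding(watched_ids, watch_fractions):
        """Summarise a user purely by what they watched and for how long:
        no follows, no friends, just implicit feedback."""
        weights = np.asarray(watch_fractions, dtype=float)
        return (video_emb[watched_ids] * weights[:, None]).sum(axis=0) / weights.sum()

    def recommend(watched_ids, watch_fractions, k=5):
        u = user_embedding(watched_ids, watch_fractions)
        scores = video_emb @ u             # dot-product affinity with every candidate video
        scores[watched_ids] = -np.inf      # do not repeat what they have already seen
        return np.argsort(-scores)[:k]

    # A handful of watched videos with completion rates is enough to start
    # sorting this user towards a niche.
    print(recommend(watched_ids=[3, 17, 256], watch_fractions=[1.0, 0.9, 0.2]))

The point of the sketch is the absence of any follower graph: the only input is watch behaviour, which is what lets the matching, and the separation into subcultures, happen so quickly.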

4 Rhetorical strategies


Sea-lioning is a common troll hack, and an interesting case study in strategic conversation-derailment strategies. Here is one countermeasure: the FAQ off system for live FAQs. Sea-lioning is one of many dogpiling strategies that are effective online, where economies of scarce attention are important.

5 Inference

How do we evaluate the effects of social media interventions? There is, of course, standard survey modelling.

There is some structure to exploit here, e.g. causalimpact and other such time-series causal-inference systems. How about when the data is a mixture of time-series data and one-off results (e.g. polling before an election and the election itself)?
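
For the time-series part, here is a minimal sketch of the interrupted-time-series idea behind tools like causalimpact: fit on the pre-intervention period, forecast the counterfactual, and read the gap as the estimated effect. It uses plain statsmodels rather than any particular causal-impact package, and the data and effect size are simulated; real work would want a proper structural time-series model and uncertainty intervals.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Simulated daily series: a control topic (untouched) and a treated topic
    # that gets downranked on day 70. In real use these would be observed data.
    n, t0 = 100, 70
    control = 50 + np.cumsum(rng.normal(0, 1, n))
    treated = 10 + 0.8 * control + rng.normal(0, 1, n)
    treated[t0:] -= 12                                 # the "true" intervention effect

    pre = slice(0, t0)
    X = sm.add_constant(control)
    model = sm.OLS(treated[pre], X[pre]).fit()         # fit on the pre-period only

    counterfactual = model.predict(X)                  # expected path with no intervention
    effect = treated[t0:] - counterfactual[t0:]
    print(f"estimated effect: {effect.mean():.1f} (true effect was -12)")

The harder case raised above, mixing continuous time series with one-off outcomes such as the election result itself, is not addressed by anything this simple.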

Getting to the data is fraught:

Facebook’s Illusory Promise of Transparency: Facebook is currently obstructing the Ad Observatory run by the NYU Tandon School of Engineering.

Various browser data-harvesting systems exist.

6 Automatic trolling, infinite fake news

Figure 5: Photo by Markus Winkler on Unsplash

The controversial GPT-x family (Radford et al. 2019):

GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets.

It takes 5 minutes to download this package and start generating decent fake news; whether you gain anything over the traditional manual method is an open question.
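
For instance, here is a minimal sketch using the Hugging Face transformers pipeline (which may or may not be the package the text has in mind); the prompt is obviously illustrative:

    # pip install transformers torch
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    outputs = generator(
        "BREAKING: Local council confirms that",
        max_length=80,           # total length (prompt + continuation) in tokens
        num_return_sequences=3,  # several candidate "stories" per prompt
        do_sample=True,          # sample rather than greedy-decode, for variety
    )
    for o in outputs:
        print(o["generated_text"], "\n---")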

The controversial deepcom model enables automatic comment generation for your fake news (Yang et al. 2019).

Assembling these into a Twitter bot farm is left as an exercise for the student.

7 Post hoc analysis

Figure 6: Facebook selling elections

One firm promised to “use every tool and take every advantage available in order to change reality according to our client’s wishes.”

  • How Facebook Agreed to Suspend Netanyahu’s Chatbot

  • Aleksandr Kogan describes How Cambridge Analytica worked

  • Kate Starbird, the surprising nuance behind the Russian troll strategy

  • facebook political ad collector

  • Dan O’Sullivan, Inside the RNC Leak

    In what is the largest known data exposure of its kind, UpGuard’s Cyber Risk Team can now confirm that a misconfigured database containing the sensitive personal details of over 198 million American voters was left exposed to the internet by a firm working on behalf of the Republican National Committee (RNC) in their efforts to elect Donald Trump. The data, which was stored in a publicly accessible cloud server owned by Republican data firm Deep Root Analytics, included 1.1 terabytes of entirely unsecured personal information compiled by DRA and at least two other Republican contractors, TargetPoint Consulting, Inc. and Data Trust. In total, the personal information of potentially near all of America’s 200 million registered voters was exposed, including names, dates of birth, home addresses, phone numbers, and voter registration details, as well as data described as “modeled” voter ethnicities and religions. […]

    “‘Microtargeting is trying to unravel your political DNA,’ [Gage] said. ‘The more information I have about you, the better.’ The more information [Gage] has, the better he can group people into “target clusters” with names such as ‘Flag and Family Republicans’ or ‘Tax and Terrorism Moderates.’ Once a person is defined, finding the right message from the campaign becomes fairly simple.”
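
To make the “target clusters” idea quoted above concrete, here is a toy sketch of segmenting a synthetic voter file with k-means; the attributes, the number of clusters and the interpretation are all invented for illustration, not a reconstruction of any real campaign’s pipeline.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)

    # Toy voter file: each row is a voter, each column a "modeled" attribute
    # (issue scores, turnout propensity, ...). Entirely synthetic; the real
    # files described above carried hundreds of such modeled fields.
    voters = rng.normal(size=(50_000, 6))

    kmeans = KMeans(n_clusters=8, random_state=0).fit(voters)

    # Each voter now belongs to a "target cluster"; a campaign would hand-label
    # these ("Flag and Family Republicans", ...) and match messages to them.
    print(np.bincount(kmeans.labels_))    # cluster sizes
    print(kmeans.cluster_centers_[0])     # the profile a message would be tailored to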

How ISIS Games Twitter:

But what’s often overlooked in press coverage is that ISIS doesn’t just have strong, organic support online. It also employs social-media strategies that inflate and control its message. Extremists of all stripes are increasingly using social media to recruit, radicalize and raise funds, and ISIS is one of the most adept practitioners of this approach.

British army creates team of Facebook warriors

The Israel Defence Forces have pioneered state military engagement with social media, with dedicated teams operating since Operation Cast Lead, its war in Gaza in 2008-9. The IDF is active on 30 platforms — including Twitter, Facebook, Youtube and Instagram — in six languages. “It enables us to engage with an audience we otherwise wouldn’t reach,” said an Israeli army spokesman. […] During last summer’s war in Gaza, Operation Protective Edge, the IDF and Hamas’s military wing, the Qassam Brigades, tweeted prolifically, sometimes engaging directly with one another.

Nick Statt, Facebook reportedly ignored its own research showing algorithms divided users:

An internal Facebook report presented to executives in 2018 found that the company was well aware that its product, specifically its recommendation engine, stoked divisiveness and polarization, according to a new report from The Wall Street Journal. “Our algorithms exploit the human brain’s attraction to divisiveness,” one slide from the presentation read. The group found that if this core element of its recommendation engine were left unchecked, it would continue to serve Facebook users “more and more divisive content in an effort to gain user attention & increase time on the platform.” A separate internal report, crafted in 2016, said 64 percent of people who joined an extremist group on Facebook only did so because the company’s algorithm recommended it to them, the WSJ reports.

Leading the effort to downplay these concerns and shift Facebook’s focus away from polarization has been Joel Kaplan, Facebook’s vice president of global public policy and former chief of staff under President George W. Bush. Kaplan is a controversial figure in part due to his staunch right-wing politics — he supported Supreme Court Justice Brett Kavanaugh throughout his nomination — and his apparent ability to sway CEO Mark Zuckerberg on important policy matters.

Figure 7: Ray Serrato, Anatomy of a Fake Antifa Tweet

Ray Serrato documents the kind of dynamics that we should be aware of here. One false-flag tweet circulated by partisans gets far more exposure as evidence of the vileness of the people it purports to come from than does the belated take-down of that tweet.

8 Incoming

9 References

Aistrope. 2016. Social Media and Counterterrorism Strategy.” Australian Journal of International Affairs.
Allen, Farrell, and Shalizi. 2017. Evolutionary Theory and Endogenous Institutional Change.”
Arif, Stewart, and Starbird. 2018. Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proc. ACM Hum.-Comput. Interact.
Banerjee, Chandrasekhar, Duflo, et al. 2019. Using Gossips to Spread Information: Theory and Evidence from Two Randomized Controlled Trials.” The Review of Economic Studies.
Bay. 2018. Weaponizing the Haters: The Last Jedi and the Strategic Politicization of Pop Culture Through Social Media Manipulation.” First Monday.
Behr, Reding, Edwards, et al. 2013. Radicalisation in the Digital Era: The Use of the Internet in 15 Cases of Terrorism and Extremism.”
Benkler, Faris, and Roberts. 2018. Network propaganda: manipulation, disinformation, and radicalization in American politics.
Beskow. 2020. Finding and Characterizing Information Warfare Campaigns.”
Bessi. 2016. On the Statistical Properties of Viral Misinformation in Online Social Media.” arXiv:1609.09435 [Physics, Stat].
Bradshaw, and Howard. 2017. Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation.”
Brito, Paula, Fernandes, et al. 2019. Social Media and Presidential Campaigns – Preliminary Results of the 2018 Brazilian Presidential Election.” In Proceedings of the 20th Annual International Conference on Digital Government Research. Dg.o 2019.
Broniatowski, Jamison, Qi, et al. 2018. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health.
Bursztyn, and Birnbaum. 2019. “Thousands of Small, Constant Rallies: A Large-Scale Analysis of Partisan WhatsApp Groups.”
Cadwalladr. 2017. The Great British Brexit Robbery: How Our Democracy Was Hijacked.” The Guardian.
Callaway, Newman, Strogatz, et al. 2000. Network Robustness and Fragility: Percolation on Random Graphs.” Physical Review Letters.
Campbell. 2013. Social Networks and Political Participation.” Annual Review of Political Science.
Coscia. 2017. Popularity Spikes Hurt Future Chances for Viral Propagation of Protomemes.” Communications of the ACM.
Dittmar, and Seabold. 2015. Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation.” Centre for Economic Performance Working Paper.
Dixon, Lerner, and Bashian. 2024. Challenges to Correcting Pluralistic Ignorance: False Consensus Effects, Competing Information Environments, and Anticipated Social Conflict.” Human Communication Research.
Dodds. 2017. Slightly Generalized Generalized Contagion: Unifying Simple Models of Biological and Social Spreading.” arXiv:1708.09697 [Physics].
Eady, Paskhalis, Zilinsky, et al. 2023. Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 US Election and Its Relationship to Attitudes and Voting Behavior.” Nature Communications.
Evans. 2017. The Economics of Attention Markets.” SSRN Scholarly Paper ID 3044858.
Farrell. n.d. Blame Fox, Not Facebook, for Fake News.” Washington Post.
Farrell, and Schneier. 2018. Common-Knowledge Attacks on Democracy.” SSRN Scholarly Paper ID 3273111.
Farrell, and Shalizi. 2012. “Cognitive Democracy.” Crooked Timber.
Farrell, and Shalizi. 2021. 9 Pursuing Cognitive Democracy.” In 9 Pursuing Cognitive Democracy.
Farrell, and Shalizi. n.d.a. An Outline of Cognitive Democracy.”
———. n.d.b. “Evolutionary Theory and the Dynamics of Institutional Change.”
Freelon, Marwick, and Kreiss. 2020. False Equivalencies: Online Activism from Left to Right.” Science.
Gibney. 2018. The Scant Science Behind Cambridge Analytica’s Controversial Marketing Techniques.” Nature.
Goel, Anderson, Hofman, et al. 2015. The Structural Virality of Online Diffusion.” Management Science.
Goel, Hofman, Lahaie, et al. 2010. Predicting Consumer Behavior with Web Search.” Proceedings of the National Academy of Sciences.
Goel, Mason, and Watts. 2010. Real and Perceived Attitude Agreement in Social Networks.” Journal of Personality and Social Psychology.
Goel, Watts, and Goldstein. 2012. The Structure of Online Diffusion Networks.” In Proceedings of the 13th ACM Conference on Electronic Commerce - EC ’12.
Gonzalez-Bailon. 2009. Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks.
Granovetter, Mark S. 1973. The Strength of Weak Ties.” The American Journal of Sociology.
Granovetter, Mark. 1983. The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory.
Grossman, and Helpman. 2019. Electoral Competition with Fake News.” w26409.
Hamid, and Ariza. n.d. Offline Versus Online Radicalisation: Which Is the Bigger Threat?
Harwell. 2021. Lonely, Angry, Eager to Make History: Online Mobs Are Likely to Remain a Dangerous Reality.” Washington Post.
Hassan, Brouillette-Alarie, Alava, et al. 2018. Exposure to Extremist Online Content Could Lead to Violent Radicalization: A Systematic Review of Empirical Evidence.” International Journal of Developmental Science.
Hawkins, Yudkin, Juan-Torres, et al. 2019. Hidden Tribes: A Study of America’s Polarized Landscape.” Preprint.
Howard, and Kollanyi. 2016. Bots, #StrongerIn, and #Brexit: Computational Propaganda During the UK-EU Referendum.” Browser Download This Paper.
Hurd, and Gleeson. 2012. On Watts’ Cascade Model with Random Link Weights.” arXiv:1211.5708 [Cond-Mat, Physics:physics].
Imhoff, and Bruder. 2014. Speaking (Un-)Truth to Power: Conspiracy Mentality as a Generalised Political Attitude.” European Journal of Personality.
Jackson, Malladi, and McAdams. 2019. Learning Through the Grapevine: The Impact of Noise and the Breadth and Depth of Social Networks.” SSRN Scholarly Paper ID 3269543.
Jaidka, Chen, Chesterman, et al. 2024. Misinformation, Disinformation, and Generative AI: Implications for Perception and Policy.” Digit. Gov.: Res. Pract.
Jakesch, Garimella, Eckles, et al. 2021. #Trend Alert: How a Cross-Platform Organization Manipulated Twitter Trends in the Indian General Election.” arXiv:2104.13259 [Cs].
Johnson, Hollyn M., and Seifert. 1994. Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Learning, Memory.
Johnson, N. F., Velasquez, Restrepo, et al. 2021. Mainstreaming of Conspiracy Theories and Misinformation.”
Kellow, and Steeves. 1998. The Role of Radio in the Rwandan Genocide.” Journal of Communication.
Kim. 2015. Does Disagreement Mitigate Polarization? How Selective Exposure and Disagreement Affect Political Polarization.” Journalism & Mass Communication Quarterly.
Klausen. 2015. Tweeting the Jihad: Social Media Networks of Western Foreign Fighters in Syria and Iraq.” Studies in Conflict & Terrorism.
Kreps. 2020. Social Media and International Relations.
LaFrance. 2020. The Prophecies of Q.” The Atlantic.
Larson. 2018. The Biggest Pandemic Risk? Viral Misinformation.” Nature.
Levy, and Razin. 2019. Echo Chambers and Their Effects on Economic and Political Outcomes.” Annual Review of Economics.
Lewis. n.d. “Broadcasting the Reactionary Right on YouTube.”
Lin, and Kerr. 2019. On Cyber-Enabled Information Warfare and Information Operations.” SSRN Scholarly Paper ID 3015680.
Machado, Kira, Narayanan, et al. 2019. A Study of Misinformation in WhatsApp Groups with a Focus on the Brazilian Presidential Elections. In Companion Proceedings of The 2019 World Wide Web Conference. WWW ’19.
Mahmoodi, Bang, Olsen, et al. 2015. Equality Bias Impairs Collective Decision-Making Across Cultures.” Proceedings of the National Academy of Sciences.
Martin, and Yurukoglu. 2017. Bias in Cable News: Persuasion and Polarization.” American Economic Review.
Marwick, and Lewis. 2017. Media Manipulation and Disinformation Online.”
Munn. 2019. Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday.
Nyhan. 2021. Why the Backfire Effect Does Not Explain the Durability of Political Misperceptions.” Proceedings of the National Academy of Sciences.
O’Connor, and Weatherall. 2019. The Misinformation Age: How False Beliefs Spread.
Oliver, J. Eric, and Wood. 2014. Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science.
Oliver, Eric, and Wood. 2014. Larger Than Life.” New Scientist.
Osborne. 2022. Science Education in an Age of Misinformation.”
Ottman, Davis, Ottman, et al. 2022. The Censorship Effect.”
Parker. 2018. The Art of Gathering: How We Meet and Why It Matters.
Powell, and Weisman. 2018. Articulating Lay Theories Through Graphical Models: A Study of Beliefs Surrounding Vaccination Decisions.”
Radford, Wu, Child, et al. 2019. “Language Models Are Unsupervised Multitask Learners.”
Redlawsk, Civettini, and Emmerson. 2010. The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?” Political Psychology.
Reyna. 2021. A Scientific Theory of Gist Communication and Misinformation Resistance, with Implications for Health, Education, and Policy.” Proceedings of the National Academy of Sciences.
Ribeiro, Ottoni, West, et al. 2019. Auditing Radicalization Pathways on YouTube.” arXiv:1908.08313 [Cs].
Richardson, Huynh, and Sotto. 2019. Get together: how to build a community with your people.
Rieder, Matamoros-Fernández, and Coromina. 2018. From Ranking Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube Search Results.” Convergence.
Rogall. 2014. Mobilizing the Masses for Genocide.”
Roose. 2020. Get Ready for a Vaccine Information War.” The New York Times.
———. 2021. Inside Facebook’s Data Wars.” The New York Times.
Roozenbeek, and Linden. 2019. Fake News Game Confers Psychological Resistance Against Online Misinformation.” Palgrave Communications.
Salamanos, Jensen, Iordanou, et al. 2020. Did State-Sponsored Trolls Shape the 2016 US Presidential Election Discourse? Quantifying Influence on Twitter.”
Salganik, Dodds, and Watts. 2006. Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market.” Science.
Salganik, and Watts. 2008. Leading the Herd Astray: An Experimental Study of Self-Fulfilling Prophecies in an Artificial Cultural Market.” Social Psychology Quarterly.
Schuchard, Crooks, Stefanidis, et al. 2019. Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Applied Network Science.
Sharma, Hofman, and Watts. 2015. Estimating the Causal Impact of Recommendation Systems from Observational Data.” Proceedings of the Sixteenth ACM Conference on Economics and Computation - EC ’15.
Staines, and Moy. 2018. Tackling Misinformation in an Open Society.”
Starbird. 2017. Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In Eleventh International AAAI Conference on Web and Social Media.
———. 2019. Disinformation’s Spread: Bots, Trolls and All of Us.” Nature.
Stewart, Leo Graiden, Arif, Nied, et al. 2017. Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction.
Stewart, Leo G, Arif, and Starbird. 2018. “Examining Trolls and Polarization with a Retweet Network.”
Taylor, and Hoffmann. n.d. Industry Responses to Computational Propaganda and Social Media Manipulation.”
Tufekci. 2014. Engineering the Public: Big Data, Surveillance and Computational Politics.” First Monday.
Uscinski, and Atkinson. 2013. Why Do People Believe in Conspiracy Theories? The Role of Informational Cues and Predispositions.” SSRN Scholarly Paper ID 2268782.
Verwimp. 2005. An Economic Profile of Peasant Perpetrators of Genocide: Micro-Level Evidence from Rwanda.” Journal of Development Economics.
Vosoughi, Roy, and Aral. 2018. The Spread of True and False News Online.” Science.
Watts, Duncan J. 2014. Common Sense and Sociological Explanations.” American Journal of Sociology.
Watts, Duncan J., and Dodds. 2007. Influentials, Networks, and Public Opinion Formation.” Journal of Consumer Research.
Watts, Duncan J, and Strogatz. 1998. Collective Dynamics of ‘Small-World’ Networks.” Nature.
Wilson, Zhou, and Starbird. 2018. Assembling Strategic Narratives: Information Operations As Collaborative Work Within an Online Community.” Proc. ACM Hum.-Comput. Interact.
Winter. 2019. Online Hate: From the Far-Right to the ‘Alt-Right’ and from the Margins to the Mainstream.” In Online Othering: Exploring Digital Violence and Discrimination on the Web. Palgrave Studies in Cybercrime and Cybersecurity.
Yang, Xu, Wu, et al. 2019. Read, Attend and Comment: A Deep Architecture for Automatic News Comment Generation.” arXiv:1909.11974 [Cs].