Weaponised social media

Trolls, bots, lulz, infowars and other moods of the modern networked ape

Various notes on information warfare waged over the social graph for the purpose of controlling human behaviour.

This is the other side of trusted news: hacking the implicit reputation systems of social media to suborn factual reporting, or to motivate people to behave in ways that suit your goals, e.g. by selling uncertainty.

Research in this area is notably terrible, possibly because our tools for causal inference on social graphs are weak and the problem is hard, or perhaps because some actors have tools that work very well, and anyone with a genuinely effective tool for controlling the public is not going to advertise the fact.

  • Techcrunch summary of the Facebook testing debacle

    every product, brand, politician, charity, and social movement is trying to manipulate your emotions on some level, and they’re running A/B tests to find out how. They all want you to use more, spend more, vote for them, donate money, or sign a petition by making you happy, insecure, optimistic, sad, or angry. There are many tools for discovering how best to manipulate these emotions, including analytics, focus groups, and A/B tests.
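
    The mechanics of the last step are mundane. A minimal sketch of how such an A/B test might be scored, as a two-proportion z-test via statsmodels; the framings and all the numbers here are invented:

    ```python
    # Sketch: scoring an A/B test of two emotional framings of the same ad.
    # All counts are invented for illustration.
    from statsmodels.stats.proportion import proportions_ztest

    clicks = [310, 355]             # conversions under framing A vs. framing B
    impressions = [10_000, 10_000]  # users shown each framing

    stat, p_value = proportions_ztest(count=clicks, nobs=impressions)
    print(f"z = {stat:.2f}, p = {p_value:.4f}")
    # A small p-value suggests the two framings move behaviour differently.
    ```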

Fun keywords:

How it could be done

Strategies

Sea-lioning is a common trolling hack, and deserves a whole essay as a case study in strategic conversation derailment. Here is one countermeasure: the FAQ off system for live FAQs.

Evaluating

How do you do observational inference on these systems? Standard survey modelling applies, of course. There is also some structure to exploit here, e.g. via CausalImpact-style intervention analysis (sketched below). What about when the data is a mixture of time-series data and one-off outcomes (e.g. polling before an election plus the election result itself)?
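
A minimal sketch of the intervention-analysis option, assuming the Python port of Google's CausalImpact (`pip install pycausalimpact`); the file, column names and dates are invented, and note the one-off outcomes (the election result itself) do not fit this framework directly:

```python
# Sketch: did an influence campaign shift an engagement time series?
import pandas as pd
from causalimpact import CausalImpact

# Daily engagement of the targeted population (y) plus a control series (x1)
# believed to be unaffected by the campaign.
data = pd.read_csv("engagement.csv", index_col="date", parse_dates=True)[["y", "x1"]]

pre_period = ["2019-01-01", "2019-03-31"]   # before the intervention
post_period = ["2019-04-01", "2019-05-31"]  # after it begins

ci = CausalImpact(data, pre_period, post_period)
print(ci.summary())  # estimated counterfactual vs. observed post-period
```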

Automatic trolling, infinite fake news

The controversial GPT-2 (Radford et al. 2019)

GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets.

It takes five minutes to download this package and start generating passable fake news; whether you gain anything over the traditional manual method is an open question.
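
The package is not named above; one readily available route is the Hugging Face `transformers` pipeline, which wraps the released GPT-2 weights. A minimal sketch, with an arbitrary prompt and sampling parameters:

```python
# Sketch: conditional text generation from the public GPT-2 checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Scientists announced today that"  # arbitrary seed text
for out in generator(prompt, max_length=60, num_return_sequences=3, do_sample=True):
    print(out["generated_text"], "\n---")
```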

The controversial deepcom model (Yang et al. 2019) enables automatic comment generation for your fake news.

Assembling these into a twitter bot farm is left as an exercise for the student.

How it’s being done

Facebook selling elections

One firm promised to “use every tool and take every advantage available in order to change reality according to our client’s wishes.”

  • How Facebook Agreed to Suspend Netanyahu’s Chatbot
  • Aleksandr Kogan describes How Cambridge Analytica worked
  • Kate Starbird, the surprising nuance behind the Russian troll strategy
  • facebook political ad collector
  • Dan O’Sullivan, Inside the RNC Leak

    In what is the largest known data exposure of its kind, UpGuard’s Cyber Risk Team can now confirm that a misconfigured database containing the sensitive personal details of over 198 million American voters was left exposed to the internet by a firm working on behalf of the Republican National Committee (RNC) in their efforts to elect Donald Trump. The data, which was stored in a publicly accessible cloud server owned by Republican data firm Deep Root Analytics, included 1.1 terabytes of entirely unsecured personal information compiled by DRA and at least two other Republican contractors, TargetPoint Consulting, Inc. and Data Trust. In total, the personal information of potentially near all of America’s 200 million registered voters was exposed, including names, dates of birth, home addresses, phone numbers, and voter registration details, as well as data described as “modeled” voter ethnicities and religions.

    “‘Microtargeting is trying to unravel your political DNA,’ [Gage] said. ‘The more information I have about you, the better.’ The more information [Gage] has, the better he can group people into “target clusters” with names such as ‘Flag and Family Republicans’ or ‘Tax and Terrorism Moderates.’ Once a person is defined, finding the right message from the campaign becomes fairly simple.”

  • Trump campaign using targeted Facebook posts to discourage black Americans from voting:

    Businessweek, which published a major look into the campaign this morning, explains how the Trump team has quietly organized a data enterprise to sharpen its White House bid. According to the magazine, the campaign is meanwhile attempting to depress votes in demographics where Hillary Clinton is winning by wide margins.

  • Inside the Trump Bunker:

    Parscale was given a small budget to expand Trump’s base and decided to spend it all on Facebook. He developed rudimentary models, matching voters to their Facebook profiles and relying on that network’s “Lookalike Audiences” to expand his pool of targets. He ultimately placed $2 million in ads across several states, all from his laptop at home, then used the social network’s built-in “brand-lift” survey tool to gauge the effectiveness of his videos, which featured infographic-style explainers about his policy proposals or Trump speaking to the camera. “I always wonder why people in politics act like this stuff is so mystical,” Parscale says. “It’s the same shit we use in commercial, just has fancier names.”
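
Mechanically, the "target clusters" idea Gage describes above is just unsupervised clustering over modelled voter attributes. A toy sketch using k-means; all features, sizes and names here are invented:

```python
# Sketch: partitioning an electorate into message-targeting clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Columns: modelled ideology, religiosity, and tax-sensitivity scores.
voters = rng.normal(size=(10_000, 3))

# Each cluster would then get its own tailored message,
# a la "Flag and Family Republicans".
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(voters)
print(np.bincount(labels))  # cluster sizes
```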

Jonathan Stray, What tools do we have to combat disinformation?

Sarah Thompson, Fake Faces: People Who Do Not Exist Invade Facebook To Influence 2020 Elections is an interesting bit of meta-analysis on Lead Stories:

Lead Stories uses the Trendolizer™ engine to detect the most trending stories from known fake news, satire and prank websites and tries to debunk them as fast as possible.

(I think this is publicity/loss leader for Trendolizer, a media buzz product.)

How ISIS Games Twitter:

But what’s often overlooked in press coverage is that ISIS doesn’t just have strong, organic support online. It also employs social-media strategies that inflate and control its message. Extremists of all stripes are increasingly using social media to recruit, radicalize and raise funds, and ISIS is one of the most adept practitioners of this approach.

British army creates team of Facebook warriors

The Israel Defence Forces have pioneered state military engagement with social media, with dedicated teams operating since Operation Cast Lead, its war in Gaza in 2008-9. The IDF is active on 30 platforms – including Twitter, Facebook, Youtube and Instagram – in six languages. “It enables us to engage with an audience we otherwise wouldn’t reach,” said an Israeli army spokesman. During last summer’s war in Gaza, Operation Protective Edge, the IDF and Hamas’s military wing, the Qassam Brigades, tweeted prolifically, sometimes engaging directly with one another.

Aistrope, Tim. 2016. “Social Media and Counterterrorism Strategy.” Australian Journal of International Affairs 70 (2): 121–38. https://doi.org/10.1080/10357718.2015.1113230.

Arif, Ahmer, Leo Graiden Stewart, and Kate Starbird. 2018. “Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 2 (CSCW): 20:1–20:27. https://doi.org/10.1145/3274289.

Bay, Morten. 2018. “Weaponizing the Haters: The Last Jedi and the Strategic Politicization of Pop Culture Through Social Media Manipulation.” First Monday 23 (11). https://doi.org/10.5210/fm.v23i11.9388.

Benkler, Yochai, Rob Faris, and Harold Roberts. 2018. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. New York, NY: Oxford University Press. http://fdslive.oup.com/www.oup.com/academic/pdf/openaccess/9780190923624.pdf.

Bessi, Alessandro. 2016. “On the Statistical Properties of Viral Misinformation in Online Social Media,” September. http://arxiv.org/abs/1609.09435.

Bradshaw, S., and P. Howard. 2017. “Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation.” Working Paper 2017.12. https://ora.ox.ac.uk/objects/uuid:cef7e8d9-27bf-4ea5-9fd6-855209b3e1f6.

Brito, Kellyton, Natalia Paula, Manoel Fernandes, and Silvio Meira. 2019. “Social Media and Presidential Campaigns – Preliminary Results of the 2018 Brazilian Presidential Election.” In Proceedings of the 20th Annual International Conference on Digital Government Research, 332–41. Dg.o 2019. Dubai, United Arab Emirates: ACM. https://doi.org/10.1145/3325112.3325252.

Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84. https://doi.org/10.2105/AJPH.2018.304567.

Bursztyn, Victor S, and Larry Birnbaum. 2019. “Thousands of Small, Constant Rallies: A Large-Scale Analysis of Partisan WhatsApp Groups,” 6.

Cadwalladr, Carole. 2017. “The Great British Brexit Robbery: How Our Democracy Was Hijacked.” The Guardian: Technology, May 7, 2017. https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy.

Callaway, Duncan S., M. E. J. Newman, Steven H. Strogatz, and Duncan J. Watts. 2000. “Network Robustness and Fragility: Percolation on Random Graphs.” Physical Review Letters 85 (25): 5468–71. https://doi.org/10.1103/PhysRevLett.85.5468.

Coscia, Michele. 2017. “Popularity Spikes Hurt Future Chances for Viral Propagation of Protomemes.” Communications of the ACM 61 (1): 70–77. https://doi.org/10.1145/3158227.

Crooks, Andrew. n.d. “Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Accessed August 15, 2019. https://www.gisagents.org/2019/08/bot-stamina-examining-influence-and.html.

Dittmar, Jeremiah E., and Skipper Seabold. 2015. “Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation.” Centre for Economic Performance Working Paper. http://www.jeremiahdittmar.com/files/dittmar_seabold_print_religion.pdf.

Evans, David S. 2017. “The Economics of Attention Markets.” SSRN Scholarly Paper ID 3044858. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3044858.

Farrell, Henry. n.d. “Blame Fox, Not Facebook, for Fake News.” Washington Post: Monkey Cage. Accessed September 16, 2019. https://www.washingtonpost.com/news/monkey-cage/wp/2018/11/06/blame-fox-not-facebook-for-fake-news/.

Farrell, Henry, and Bruce Schneier. 2018. “Common-Knowledge Attacks on Democracy.” SSRN Scholarly Paper ID 3273111. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3273111.

Goel, Sharad, Ashton Anderson, Jake Hofman, and Duncan J. Watts. 2015. “The Structural Virality of Online Diffusion.” Management Science, July, 150722112809007. https://doi.org/10.1287/mnsc.2015.2158.

Goel, Sharad, Jake M. Hofman, Sébastien Lahaie, David M. Pennock, and Duncan J. Watts. 2010. “Predicting Consumer Behavior with Web Search.” Proceedings of the National Academy of Sciences 107 (41): 17486–90. https://doi.org/10.1073/pnas.1005962107.

Goel, Sharad, Winter Mason, and Duncan J. Watts. 2010. “Real and Perceived Attitude Agreement in Social Networks.” Journal of Personality and Social Psychology 99 (4): 611–21. https://doi.org/10.1037/a0020697.

Goel, Sharad, Duncan J. Watts, and Daniel G. Goldstein. 2012. “The Structure of Online Diffusion Networks.” In Proceedings of the 13th ACM Conference on Electronic Commerce - EC ’12, 623. Valencia, Spain: ACM Press. https://doi.org/10.1145/2229012.2229058.

Gonzalez-Bailon, Sandra. 2009. “Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks 31 (4): 271–80. https://doi.org/10.1016/j.socnet.2009.07.003.

Granovetter, Mark. 1983. “The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory 1 (1): 201–33. http://www.soc.ucsb.edu/faculty/friedkin/Syllabi/Soc148/Granovetter%201983.pdf.

Granovetter, Mark S. 1973. “The Strength of Weak Ties.” The American Journal of Sociology 78 (6): 1360–80. https://doi.org/10.2307/2776392.

Grossman, Gene, and Elhanan Helpman. 2019. “Electoral Competition with Fake News.” w26409. Cambridge, MA: National Bureau of Economic Research. https://doi.org/10.3386/w26409.

Howard, Philip N., and Bence Kollanyi. 2016. “Bots, #StrongerIn, and #Brexit: Computational Propaganda During the UK-EU Referendum.” SSRN Scholarly Paper. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2798311.

Hurd, T. R., and James P. Gleeson. 2012. “On Watts’ Cascade Model with Random Link Weights,” November. http://arxiv.org/abs/1211.5708.

Johnson, Hollyn M., and Colleen M. Seifert. 1994. “Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Journal of Experimental Psychology: Learning, Memory, and Cognition 20 (6): 1420–36. https://www.researchgate.net/profile/Colleen_Seifert/publication/232501255_Sources_of_the_Continued_Influence_Effect_When_Misinformation_in_Memory_Affects_Later_Inferences/links/5485d4070cf268d28f0045c0.pdf.

Klausen, Jytte. 2015. “Tweeting the Jihad: Social Media Networks of Western Foreign Fighters in Syria and Iraq.” Studies in Conflict & Terrorism 38 (1): 1–22. https://doi.org/10.1080/1057610X.2014.974948.

Larson, Heidi J. 2018. “The Biggest Pandemic Risk? Viral Misinformation.” Nature 562 (October): 309. https://doi.org/10.1038/d41586-018-07034-4.

Levy, Gilat, and Ronny Razin. 2019. “Echo Chambers and Their Effects on Economic and Political Outcomes.” Annual Review of Economics 11 (1): 303–28. https://doi.org/10.1146/annurev-economics-080218-030343.

Lewis, Rebecca. 2018. “Alternative Influence: Broadcasting the Reactionary Right on YouTube.” Data & Society Research Institute.

Lin, Herbert, and Jaclyn Kerr. 2019. “On Cyber-Enabled Information Warfare and Information Operations.” SSRN Scholarly Paper ID 3015680. Rochester, NY: Social Science Research Network. https://papers.ssrn.com/abstract=3015680.

Machado, Caio, Beatriz Kira, Vidya Narayanan, Bence Kollanyi, and Philip Howard. 2019. “A Study of Misinformation in WhatsApp Groups with a Focus on the Brazilian Presidential Elections.” In Companion Proceedings of the 2019 World Wide Web Conference, 1013–9. WWW ’19. San Francisco, USA: ACM. https://doi.org/10.1145/3308560.3316738.

Mahmoodi, Ali, Dan Bang, Karsten Olsen, Yuanyuan Aimee Zhao, Zhenhao Shi, Kristina Broberg, Shervin Safavi, et al. 2015. “Equality Bias Impairs Collective Decision-Making Across Cultures.” Proceedings of the National Academy of Sciences 112 (12): 3835–40. https://doi.org/10.1073/pnas.1421692112.

Martin, Gregory J., and Ali Yurukoglu. 2017. “Bias in Cable News: Persuasion and Polarization.” American Economic Review 107 (9): 2565–99. https://doi.org/10.1257/aer.20160812.

Marwick, Alice, and Rebecca Lewis. 2017. “Media Manipulation and Disinformation Online.” Data & Society Research Institute. https://datasociety.net/output/media-manipulation-and-disinfo-online/.

Munn, Luke. 2019. “Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday 24 (6). https://doi.org/10.5210/fm.v24i6.10108.

O’Connor, Cailin, and James Owen Weatherall. 2019. The Misinformation Age: How False Beliefs Spread. 1st edition. New Haven: Yale University Press.

Oliver, Eric, and Tom Wood. 2014. “Larger Than Life.” New Scientist 224 (3000): 36–37. https://doi.org/10.1016/S0262-4079(14)62441-6.

Oliver, J. Eric, and Thomas J. Wood. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66. https://doi.org/10.1111/ajps.12084.

Parker, Priya. 2018. The Art of Gathering: How We Meet and Why It Matters. International edition. New York: Riverhead Books.

Radford, Alec, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever. 2019. “Language Models Are Unsupervised Multitask Learners,” 24.

Redlawsk, David P., Andrew J. W. Civettini, and Karen M. Emmerson. 2010. “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?” Political Psychology 31 (4): 563–93. https://doi.org/10.1111/j.1467-9221.2010.00772.x.

Ribeiro, Manoel Horta, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira. 2019. “Auditing Radicalization Pathways on YouTube,” August. http://arxiv.org/abs/1908.08313.

Richardson, Bailey, Kevin Huynh, and Kai Elmer Sotto. 2019. Get Together: How to Build a Community with Your People.

Rieder, Bernhard, Ariadna Matamoros-Fernández, and Òscar Coromina. 2018. “From Ranking Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube Search Results.” Convergence 24 (1): 50–68. https://doi.org/10.1177/1354856517736982.

Roozenbeek, Jon, and Sander van der Linden. 2019. “Fake News Game Confers Psychological Resistance Against Online Misinformation.” Palgrave Communications 5 (1): 1–10. https://doi.org/10.1057/s41599-019-0279-9.

Salganik, Matthew J., Peter Sheridan Dodds, and Duncan J. Watts. 2006. “Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market.” Science 311 (5762): 854–56. https://doi.org/10.1126/science.1121066.

Salganik, Matthew J., and Duncan J. Watts. 2008. “Leading the Herd Astray: An Experimental Study of Self-Fulfilling Prophecies in an Artificial Cultural Market.” Social Psychology Quarterly 71 (4): 338–55. https://doi.org/10.1177/019027250807100404.

Schuchard, Ross, Andrew T. Crooks, Anthony Stefanidis, and Arie Croitoru. 2019. “Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Applied Network Science 4 (1): 1–23. https://doi.org/10.1007/s41109-019-0164-x.

Sharma, Amit, Jake M. Hofman, and Duncan J. Watts. 2015. “Estimating the Causal Impact of Recommendation Systems from Observational Data.” Proceedings of the Sixteenth ACM Conference on Economics and Computation - EC ’15, 453–70. https://doi.org/10.1145/2764468.2764488.

Staines, Cassie, and Will Moy. 2018. “Tackling Misinformation in an Open Society.” https://fullfact.org/media/uploads/full_fact_tackling_misinformation_in_an_open_society.pdf.

Starbird, Kate. 2017. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In Eleventh International AAAI Conference on Web and Social Media. https://www.aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15603.

———. 2019. “Disinformation’s Spread: Bots, Trolls and All of Us.” Nature 571 (July): 449. https://doi.org/10.1038/d41586-019-02235-x.

Stewart, Leo G, Ahmer Arif, and Kate Starbird. 2018. “Examining Trolls and Polarization with a Retweet Network,” 6.

Stewart, Leo Graiden, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. “Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 1–23. https://doi.org/10.1145/3134920.

Taylor, Emily, and Stacie Hoffmann. n.d. “Industry Responses to Computational Propaganda and Social Media Manipulation.” Oxford Information Labs. Accessed November 25, 2019. https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/11/Industry-Responses-Walsh-Hoffmann.pdf.

Verwimp, Philip. 2005. “An Economic Profile of Peasant Perpetrators of Genocide: Micro-Level Evidence from Rwanda.” Journal of Development Economics 77 (2): 297–323. https://doi.org/10.1016/j.jdeveco.2004.04.005.

Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359 (6380): 1146–51. https://doi.org/10.1126/science.aap9559.

Watts, Duncan J. 2014. “Common Sense and Sociological Explanations.” American Journal of Sociology 120 (2): 313–51. https://doi.org/10.1086/678271.

Watts, Duncan J., and Peter Sheridan Dodds. 2007. “Influentials, Networks, and Public Opinion Formation.” Journal of Consumer Research 34 (4): 441–58. https://doi.org/10.1086/518527.

Watts, Duncan J., and Steven H. Strogatz. 1998. “Collective Dynamics of ‘Small-World’ Networks.” Nature 393 (6684): 440–42. https://doi.org/10.1038/30918.

Wilson, Tom, Kaitlyn Zhou, and Kate Starbird. 2018. “Assembling Strategic Narratives: Information Operations as Collaborative Work Within an Online Community.” Proceedings of the ACM on Human-Computer Interaction 2 (CSCW): 183:1–183:26. https://doi.org/10.1145/3274452.

Winter, Aaron. 2019. “Online Hate: From the Far-Right to the ‘Alt-Right’ and from the Margins to the Mainstream.” In Online Othering: Exploring Digital Violence and Discrimination on the Web, edited by Karen Lumsden and Emily Harmer, 39–63. Palgrave Studies in Cybercrime and Cybersecurity. Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-12633-9_2.

Yang, Ze, Can Xu, Wei Wu, and Zhoujun Li. 2019. “Read, Attend and Comment: A Deep Architecture for Automatic News Comment Generation,” October. http://arxiv.org/abs/1909.11974.