Tribal sorting and polarization

Polarization and fragmentation by browser cookies

February 12, 2017 — October 8, 2022

collective knowledge
cooperation
distributed
game theory
how do science
insurgency
networks
social graph
sociology
squad
wonk
Figure 1

NB: This page is under construction. Content has been recycled from an increasingly inaccurately named “filter bubbles” notebook. See also media weaponization, memetics, epistemic communities etc.

1 In the news

Theirtube

Theirtube is a YouTube filter bubble simulator that provides a look into how videos are recommended on other people’s YouTube. Users can experience how the YouTube home page would look for six different personas. Each persona simulates the viewing environment of a real YouTube user who experienced being inside a recommendation bubble, by recreating a YouTube account with a similar viewing history. Theirtube shows how YouTube’s recommendations can drastically shape someone’s experience on the platform and, as a result, their worldview. It is part of the Mozilla Creative Media Awards 2020, an art and advocacy project examining AI’s effect on media and truth, developed by Tomo Kihara.

I think the proverb “Fish discover water last” can also be said about how we are blind to the nature of the recommendation bubble that we are in. Nowadays with an AI curating almost all of what we see, the only way for a person to get a better perspective on their own media environment is to see what others’ look like. By offering a tool to understand what the other recommendation bubbles look like, we hope to help people to get a better perspective on their own recommendation bubbles.

source

Figure 2: Write the news to suit your own biases using the FREE Scarfaset transfers

…newspapers such as the Scarfolk Mail realised that they no longer needed to provide actual content: Readers only saw what they wanted to see and comprehended what they wanted to comprehend.

“Data journalism” has created interesting tools. But do people care about data? Are facts persuasive? Even if facts are persuasive, are facts enough? As Gilad Lotan anecdotally illustrates, merely selecting facts can get you your own little reality, without even bothering to lie.

Figure 3: NYT data journalism on our geographic epistemic bubbles maps partisan sorting in Boston, from an article by Emily Badger, Kevin Quealy and Josh Katz

Let’s get real; for the moment, reasoned engagement with a shared rational enlightenment doesn’t dominate the media. Bread and circuses and kompromat and gut-instinct do.

In The Conspiracy Theory Bubble, Hugo Drochon argues that interest in conspiracy believers is itself a kind of collective hysteria, and that there are not actually so many conspiracy believers. (He does, however, argue for an increased salience of conspiracy believers.)

Kate Starbird’s study of agents provocateurs in online campaigns.

For a false binary, try Nick Cohen, Trump’s lies are not the problem. It’s the millions who swallow them who really matter:

Compulsive believers are not just rednecks. They include figures as elevated as the British prime minister and her cabinet. […]

Mainstream journalists are almost as credulous. After decades of imitating Jeremy Paxman and seizing on the trivial gaffes and small lies of largely harmless politicians, they are unable to cope with the fantastic lies of the new authoritarian movements. When confronted with men who lie so instinctively they believe their lies as they tell them, they can only insist on a fair hearing for the sake of “balance”. Their acceptance signals to the audience the unbelievable is worthy of belief.

This is a shallow causal analysis; however, the credulity of people in power is interesting to think about.

Alexis Madrigal, What Facebook Did to American Democracy. Buzzfeed, Inside the partisan fight for your newsfeed:

The most comprehensive study to date of the growing universe of partisan websites and Facebook pages about US politics reveals that in 2016 alone at least 187 new websites launched, and that the candidacy and election of Donald Trump has unleashed a golden age of aggressive, divisive political content that reaches a massive amount of people on Facebook.

Thanks to a trinity of the internet, Facebook, and online advertising, partisan news websites and their associated Facebook pages are almost certainly making more money for more people and reaching more Americans than at any time in history. In some cases, publishers are generating hundreds of thousands of dollars a month in revenue, with small operations easily earning five figures thanks to one website and at least one associated Facebook page.

At its root, the analysis of 667 websites and 452 associated Facebook pages reveals the extent to which American online political discourse is powered by a mix of money and outrage.

(Goel, Mason, and Watts 2010):

It is often asserted that friends and acquaintances have more similar beliefs and attitudes than do strangers; yet empirical studies disagree over exactly how much diversity of opinion exists within local social networks and, relatedly, how much awareness individuals have of their neighbors’ views. This article reports results from a network survey, conducted on the Facebook social networking platform, in which participants were asked about their own political attitudes, as well as their beliefs about their friends’ attitudes. Although considerable attitude similarity exists among friends, the results show that friends disagree more than they think they do. In particular, friends are typically unaware of their disagreements, even when they say they discuss the topic, suggesting that discussion is not the primary means by which friends infer each other’s views on particular issues. Rather, it appears that respondents infer opinions in part by relying on stereotypes of their friends and in part by projecting their own views. The resulting gap between real and perceived agreement may have implications for the dynamics of political polarization and theories of social influence in general.
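The gap between real and perceived agreement that projection produces is easy to see in a toy simulation. The sketch below is mine, not the authors’: the network, the attitude split and the projection probability are all assumed for illustration, and it simply shows perceived agreement running well above actual agreement when respondents guess friends’ views largely by projecting their own.

```python
import random

random.seed(0)

N = 1000          # people (assumed population size)
N_FRIENDS = 10    # friends per person (assumed)
P_PROJECT = 0.7   # chance a respondent projects their own view onto a friend (assumed)

# Binary attitude, split roughly 60/40 across the population (assumption).
attitude = [1 if random.random() < 0.6 else 0 for _ in range(N)]
friends = [random.sample([j for j in range(N) if j != i], N_FRIENDS) for i in range(N)]

pairs = real_agree = perceived_agree = 0
for i in range(N):
    for j in friends[i]:
        pairs += 1
        real_agree += attitude[i] == attitude[j]
        # Respondent i guesses friend j's view: usually by projecting their own,
        # otherwise by actually knowing it.
        guess = attitude[i] if random.random() < P_PROJECT else attitude[j]
        perceived_agree += guess == attitude[i]

print(f"real agreement:      {real_agree / pairs:.2f}")
print(f"perceived agreement: {perceived_agree / pairs:.2f}")
```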

(Watts and Dodds 2007):

A central idea in marketing and diffusion research is that influentials— a minority of individuals who influence an exceptional number of their peers— are important to the formation of public opinion. Here we examine this idea, which we call the “influentials hypothesis,” using a series of computer simulations of interpersonal influence processes. Under most conditions that we consider, we find that large cascades of influence are driven not by influentials but by a critical mass of easily influenced individuals. Although our results do not exclude the possibility that influentials can be important, they suggest that the influentials hypothesis requires more careful specification and testing than it has received.
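The flavour of those simulations can be reproduced with a toy threshold-cascade model. The sketch below is my own illustration, not the authors’ code; the random network, the threshold mix and the seeding comparison are assumptions. Its point is that cascade size is governed mainly by the supply of easily influenced (low-threshold) individuals, so seeding from a well-connected “influential” buys surprisingly little over seeding from an ordinary node.

```python
import random

random.seed(1)

N = 2000  # agents (assumed)
K = 6     # average degree (assumed)
# Most agents are easily influenced (low adoption threshold); a minority are stubborn.
thresholds = [random.choice([0.1, 0.1, 0.1, 0.5]) for _ in range(N)]

# Sparse random graph stored as adjacency sets.
neighbours = [set() for _ in range(N)]
for _ in range(N * K // 2):
    a, b = random.randrange(N), random.randrange(N)
    if a != b:
        neighbours[a].add(b)
        neighbours[b].add(a)

def cascade(seed_node):
    """Threshold cascade: a node adopts once the adopting fraction of its
    neighbours reaches its threshold; return the final number of adopters."""
    adopted = {seed_node}
    changed = True
    while changed:
        changed = False
        for v in range(N):
            if v in adopted or not neighbours[v]:
                continue
            frac = sum(u in adopted for u in neighbours[v]) / len(neighbours[v])
            if frac >= thresholds[v]:
                adopted.add(v)
                changed = True
    return len(adopted)

hub = max(range(N), key=lambda v: len(neighbours[v]))  # the best-connected "influential"
ordinary = random.randrange(N)                         # a randomly chosen individual
print("cascade seeded at hub:          ", cascade(hub))
print("cascade seeded at ordinary node:", cascade(ordinary))
```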

(Martin and Yurukoglu 2017):

On the right, audiences concentrate attention on purely right wing outlets. On the left and center audiences spread their attention broadly and focus on mainstream organizations. This asymmetric pattern holds for the linking practices of media producers. Both supply and demand on the right are insular and self-focused. On the left and center they are spread broadly and anchored by professional press.

These differences create a different dynamic for media, audiences, and politicians on the left and right.

We all like to hear news that confirms our beliefs and identity. On the left, outlets and politicians try to attract readers by telling such stories but are constrained because their readers are exposed to a range of outlets, many of which operate with strong fact-checking norms.

On the right, because audiences do not trust or pay attention to outlets outside their own ecosystem, there is no reality check to constrain competition. Outlets compete on political purity and stoking identity-confirming narratives. Outlets and politicians who resist the flow by focusing on facts are abandoned or vilified by audiences and competing outlets. This forces media and political elites to validate and legitimate the falsehoods, at least through silence, creating a propaganda feedback loop. …

The highly asymmetric pattern of media ecosystems we observe on the right and the left, despite the fact that Facebook and Twitter usage is roughly similar on both sides, requires that we look elsewhere for what is causing the difference.

Surveys make it clear that Fox News is by far the most influential outlet on the American right — more than five times as many Trump supporters reported using Fox News as their primary news outlet than those who named Facebook. And Trump support was highest among demographics whose social media use was lowest.

Our data repeatedly show Fox as the transmission vector of widespread conspiracy theories.

This is a strong claim; I have seen more or less the contrary asserted about which side of politics is more insular. I suspect this comes down to definitions of attention.

Farrell (n.d.) claims that Fox News moved the 2008 presidential election Republican vote share by about 6.3 percentage points to the right. (I have not read this article yet.)

In other results, we estimate that removing Fox News from cable television during the 2000 election cycle would have reduced the overall Republican presidential vote share by 0.46 percentage points. The predicted effect increases in 2004 and 2008 to 3.59 and 6.34 percentage points, respectively. This increase is driven by increasing viewership on Fox News as well as an increasingly conservative slant.

I have not yet seen how much real analysis is done in (Garimella et al. 2018), about which it is claimed:

The study identifies three essential roles for Twitter users. Partisan users both consume and produce content with only a one-sided leaning and enjoy high appreciation measured by both network centrality and content endorsement. Gatekeepers have a central role in the formation of echo chambers because they consume content with diverse leanings but choose to produce only content with a one-sided leaning. Bipartisan users produce content with both leanings and make an effort to bridge the echo chambers, but they are less valued in their networks than their partisan counterparts.

A methodology that clusters users into three discrete groups smells funky, but I really need to read the paper to see what they actually did.

Maria Bustillos

So let’s say you’re a media owner who’s “all business,” caring only for the bottom line, looking to keep shareholders and owners happy, buy yourself some expensive houses, and/or get yourself or your friends re-elected. Without a doubt, the journalists who are prepared to tell readers the truth—even about you and your friends!—can only be a hindrance, and are best shut up or rid of, to the extent possible.

Figure 4: Computing scenarios for defusing polarized politics summarises Axelrod, Daymude, and Forrest (2021). The effects of economic self-interest (P): polarization of the population’s ideological positions over time for varying levels of economic self-interest P = 0%, 1%, …, 10% (dark blue to yellow), where P is the probability that an actor will be attracted to its preferred (initial) position. (Left inset) The initial normal distribution of actors’ ideological positions, which also represent their preferred positions when acting in self-interest. (Right insets) Final configurations of the population after 2,500,000 steps for P = 0%, 1%, and 10%.
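To make the mechanics behind that figure concrete, here is a minimal attraction-repulsion sketch in the spirit of the model: actors move toward ideologically nearby interlocutors, away from distant ones, and with probability P drift back toward their preferred position. The tolerance T, responsiveness R, population size and step count below are placeholder values of mine, not the paper’s calibration.

```python
import random

random.seed(2)

N, STEPS = 100, 200_000
T, R, P = 0.25, 0.25, 0.05  # tolerance, responsiveness, self-interest probability (assumed)

# Preferred (initial) positions drawn from a normal distribution, clipped to [0, 1].
preferred = [min(max(random.gauss(0.5, 0.15), 0.0), 1.0) for _ in range(N)]
position = list(preferred)

for _ in range(STEPS):
    i = random.randrange(N)
    if random.random() < P:
        # Self-interest: drift back toward the actor's preferred position.
        position[i] += R * (preferred[i] - position[i])
    else:
        j = random.randrange(N)
        gap = position[j] - position[i]
        # Attracted to ideologically close actors, repelled by distant ones.
        step = R * gap if abs(gap) <= T else -R * gap
        position[i] = min(max(position[i] + step, 0.0), 1.0)

print(f"final ideological spread: {max(position) - min(position):.2f}")
```

The paper asks whether such self-interested pulls back toward preferred positions can prevent the drift to the extremes; the figure summarises the effect of varying P.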

2 Game-theoretic models

(Cohen 2012; Hales 2005; Hao and Leung 2011; Matlock and Sen 2007; McAvity et al. 2013; Oh 2001)
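These are largely tag-based cooperation models: agents carry an arbitrary, payoff-irrelevant marker (a “tag”) and cooperate preferentially with similarly tagged partners, which is enough to generate tribal sorting. Below is a minimal sketch of the mechanism, a toy donation game loosely in the Riolo/Cohen/Axelrod style rather than a reimplementation of any one paper, with illustrative parameter values.

```python
import random

random.seed(3)

N, GENERATIONS, PAIRINGS = 200, 100, 3   # population size, epochs, interactions per agent
BENEFIT, COST, MUTATION = 1.0, 0.1, 0.1  # donation payoffs and mutation rate (assumed)

# Each agent has an arbitrary tag and a tolerance for tag differences.
agents = [{"tag": random.random(), "tol": random.random() * 0.5} for _ in range(N)]

for _ in range(GENERATIONS):
    payoff = [0.0] * N
    for i in range(N):
        for _ in range(PAIRINGS):
            j = random.randrange(N)
            # Donate only to partners whose tag is close enough to one's own.
            if i != j and abs(agents[i]["tag"] - agents[j]["tag"]) <= agents[i]["tol"]:
                payoff[i] -= COST
                payoff[j] += BENEFIT
    # Imitation dynamics: copy a random peer if they did better, with occasional mutation.
    new_agents = []
    for i in range(N):
        j = random.randrange(N)
        parent = agents[j] if payoff[j] > payoff[i] else agents[i]
        child = dict(parent)
        if random.random() < MUTATION:
            child["tag"] = random.random()
        if random.random() < MUTATION:
            child["tol"] = random.random() * 0.5
        new_agents.append(child)
    agents = new_agents

print("surviving tag clusters (rounded):", sorted({round(a["tag"], 1) for a in agents}))
print(f"mean tolerance: {sum(a['tol'] for a in agents) / N:.2f}")
```

The interest of these models for this notebook is that cooperation clusters around markers that mean nothing in themselves, as in Cohen (2012) on accent.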

3 Incoming

  • NYU Stern Center for Business and Human Rights, Polarization Report

  • Ian Leslie, The differences of minor narcissists

    differences don’t cause conflicts; conflicts create differences. Members of a group seize on differences in order to affirm their own identity. A feedback loop ensues: differences are invented or enlarged, which stimulates further animosity, which magnifies differences, and so on.

  • Not Boring by Packy McCormick, Amplified Tribalism

  • Irrational Institutions #2. File under filter bubbles, reality bubbles, subculture dynamics.

  • Novoa et al. (2023)

    The use of category-referring statements, also known as generics (e.g., “Democrats want to defund the police”), may contribute to polarization by encouraging the adoption of broad conclusions about political categories that ignore variation within each political party. […] These findings suggest that the use of generic language, common in everyday speech, enables inferential errors that exacerbate perceived polarization.

  • The Outsiders, by Leighton Woodhouse

    In short, the autonomous pole is where cultural producers create cultural products expressly for other cultural producers. The heteronomous pole is where they produce for non-producers. Think Terrence Malick versus Michael Bay.

4 References

Aistrope. 2016. Social Media and Counterterrorism Strategy.” Australian Journal of International Affairs.
Arguedas, Robertson, Fletcher, et al. 2022. Echo Chambers, Filter Bubbles, and Polarisation: A Literature Review.”
Arif, Stewart, and Starbird. 2018. Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proc. ACM Hum.-Comput. Interact.
Ataöv. 1998. Narcissism of Minor Differences: Nurturing the ‘Clash of Civilizations’.” In.
Axelrod, Daymude, and Forrest. 2021. Preventing Extreme Polarization of Political Attitudes.” Proceedings of the National Academy of Sciences.
Banerjee, Chandrasekhar, Duflo, et al. 2019. Using Gossips to Spread Information: Theory and Evidence from Two Randomized Controlled Trials.” The Review of Economic Studies.
Benkler, Faris, and Roberts. 2018. Network propaganda: manipulation, disinformation, and radicalization in American politics.
Bessi. 2016. On the Statistical Properties of Viral Misinformation in Online Social Media.” arXiv:1609.09435 [Physics, Stat].
Boyd, and Richerson. 1990. Group Selection Among Alternative Evolutionarily Stable Strategies.” Journal of Theoretical Biology.
Bradshaw, and Howard. 2017. Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation.”
Brito, Paula, Fernandes, et al. 2019. Social Media and Presidential Campaigns – Preliminary Results of the 2018 Brazilian Presidential Election.” In Proceedings of the 20th Annual International Conference on Digital Government Research. Dg.o 2019.
Broniatowski, Jamison, Qi, et al. 2018. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health.
Bruner. 2021. Cooperation, Correlation and the Evolutionary Dominance of Tag-Based Strategies.” Biology & Philosophy.
Bursztyn, and Birnbaum. 2019. “Thousands of Small, Constant Rallies: A Large-Scale Analysis of Partisan WhatsApp Groups.”
Cadwalladr. 2017. The Great British Brexit Robbery: How Our Democracy Was Hijacked.” The Guardian.
Cha, Haddadi, Benevenuto, et al. 2010. Measuring User Influence in Twitter: The Million Follower Fallacy.” In Fourth International AAAI Conference on Weblogs and Social Media.
Cohen. 2012. The Evolution of Tag-Based Cooperation in Humans: The Case for Accent.” Current Anthropology.
Coscia. 2017. Popularity Spikes Hurt Future Chances for Viral Propagation of Protomemes.” Communications of the ACM.
Dittmar, and Seabold. 2015. Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation.” Centre for Economic Performance Working Paper.
DuFord. 2022. Solidarity in Conflict: A Democratic Theory.
Evans. 2017. The Economics of Attention Markets.” SSRN Scholarly Paper ID 3044858.
Farrell. n.d. Analysis | Blame Fox, Not Facebook, for Fake News.” Washington Post.
Farrell, and Schneier. 2018. Common-Knowledge Attacks on Democracy.” SSRN Scholarly Paper ID 3273111.
Garimella, Morales, Gionis, et al. 2018. Political Discourse on Social Media: Echo Chambers, Gatekeepers, and the Price of Bipartisanship.” arXiv:1801.01665 [Cs].
Gelman. 2007. Struggles with Survey Weighting and Regression Modeling.” Statistical Science.
Goel, Anderson, Hofman, et al. 2015. The Structural Virality of Online Diffusion.” Management Science.
Goel, Mason, and Watts. 2010. Real and Perceived Attitude Agreement in Social Networks.” Journal of Personality and Social Psychology.
Goel, Watts, and Goldstein. 2012. The Structure of Online Diffusion Networks.” In Proceedings of the 13th ACM Conference on Electronic Commerce - EC ’12.
Golub, and Jackson. 2010. Naïve Learning in Social Networks and the Wisdom of Crowds.” American Economic Journal: Microeconomics.
———. 2011. Network Structure and the Speed of Learning: Measuring Homophily Based on Its Consequences.” SSRN Scholarly Paper ID 1784542.
———. 2012. How Homophily Affects the Speed of Learning and Best-Response Dynamics.” The Quarterly Journal of Economics.
Gonzalez-Bailon. 2009. Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks.
Granovetter, Mark S. 1973. The Strength of Weak Ties.” The American Journal of Sociology.
Granovetter, Mark. 1983. The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory.
Hales. 2005. Change Your Tags Fast! – A Necessary Condition for Cooperation? In Multi-Agent and Multi-Agent-Based Simulation. Lecture Notes in Computer Science.
Hao, and Leung. 2011. Learning to Achieve Social Rationality Using Tag Mechanism in Repeated Interactions.” In 2011 IEEE 23rd International Conference on Tools with Artificial Intelligence.
Harley. 1981. Learning the Evolutionarily Stable Strategy.” Journal of Theoretical Biology.
Howard, and Kollanyi. 2016. Bots, #StrongerIn, and #Brexit: Computational Propaganda During the UK-EU Referendum.”
Jackson. 2018. The Friendship Paradox and Systematic Biases in Perceptions and Social Norms.” Journal of Political Economy.
Jackson, Malladi, and McAdams. 2019. Learning Through the Grapevine: The Impact of Noise and the Breadth and Depth of Social Networks.” SSRN Scholarly Paper ID 3269543.
Johnson, and Seifert. 1994. Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Learning, Memory.
Kempe, Kleinberg, and Tardos. 2003. Maximizing the Spread of Influence Through a Social Network.” In Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’03.
Klausen. 2015. Tweeting the Jihad: Social Media Networks of Western Foreign Fighters in Syria and Iraq.” Studies in Conflict & Terrorism.
Lee, Karimi, Jo, et al. 2017. Homophily Explains Perception Biases in Social Networks.” arXiv:1710.08601 [Physics].
Lerman, Yan, and Wu. 2016. The ‘Majority Illusion’ in Social Networks.” PLOS ONE.
Leslie. 2021. Conflicted: How Productive Disagreements Lead to Better Outcomes.
Levy, and Razin. 2019. Echo Chambers and Their Effects on Economic and Political Outcomes.” Annual Review of Economics.
Lewis. n.d. “Broadcasting the Reactionary Right on YouTube.”
Lin, and Kerr. 2019. On Cyber-Enabled Information Warfare and Information Operations.” SSRN Scholarly Paper ID 3015680.
Machado, Kira, Narayanan, et al. 2019. A Study of Misinformation in WhatsApp Groups with a Focus on the Brazilian Presidential Elections. In Companion Proceedings of The 2019 World Wide Web Conference. WWW ’19.
Martin, and Yurukoglu. 2017. Bias in Cable News: Persuasion and Polarization.” American Economic Review.
Marwick, and Lewis. 2017. Media Manipulation and Disinformation Online.”
Matlock, and Sen. 2007. Effective Tag Mechanisms for Evolving Coordination.” In Proceedings of the 6th International Joint Conference on Autonomous Agents and Multiagent Systems. AAMAS ’07.
McAvity, Bristow, Bunker, et al. 2013. Perception Without Self-Matching in Conditional Tag Based Cooperation.” Journal of Theoretical Biology.
Munn. 2019. Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday.
Novoa, Echelbarger, Gelman, et al. 2023. Generically Partisan: Polarization in Political Communication.” Proceedings of the National Academy of Sciences.
O’Connor, and Weatherall. 2019. The Misinformation Age: How False Beliefs Spread.
Oh. 2001. Promoting Cooperation Using ‘Kin’ Biased Conditional Strategy in the Iterated Prisoner’s Dilemma Game.” Information Sciences.
Oliver, J. Eric, and Wood. 2014. Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science.
Oliver, Eric, and Wood. 2014. Larger Than Life.” New Scientist.
Olteanu, Castillo, Diaz, et al. 2019. Social Data: Biases, Methodological Pitfalls, and Ethical Boundaries.” Frontiers in Big Data.
Redlawsk, Civettini, and Emmerson. 2010. The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’? Political Psychology.
Ribeiro, Ottoni, West, et al. 2019. Auditing Radicalization Pathways on YouTube.” arXiv:1908.08313 [Cs].
Rieder, Matamoros-Fernández, and Coromina. 2018. From Ranking Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube Search Results.” Convergence.
Röttger, and Vedres. 2020. “The Information Environment and Its Effects on Individuals and Groups.”
Salganik, and Watts. 2008. Leading the Herd Astray: An Experimental Study of Self-Fulfilling Prophecies in an Artificial Cultural Market.” Social Psychology Quarterly.
Schuchard, Crooks, Stefanidis, et al. 2019. Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Applied Network Science.
Shakya, and Christakis. 2017. Association of Facebook Use With Compromised Well-Being: A Longitudinal Study.” American Journal of Epidemiology.
Softky, and Benford. 2017. Sensory Metrics of Neuromechanical Trust.” Neural Computation.
Staines, and Moy. 2018. Tackling Misinformation in an Open Society.”
Starbird. 2017. Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In Eleventh International AAAI Conference on Web and Social Media.
———. 2019. Disinformation’s Spread: Bots, Trolls and All of Us.” Nature.
Stewart, Leo Graiden, Arif, Nied, et al. 2017. Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction.
Stewart, Leo G, Arif, and Starbird. 2018. “Examining Trolls and Polarization with a Retweet Network.”
Tajfel. 1982. Social Psychology of Intergroup Relations.” Annual Review of Psychology.
Talisse. 2021. Sustaining democracy: what we owe to the other side.
Tokita, Guess, and Tarnita. 2021. Polarized Information Ecosystems Can Reorganize Social Networks via Information Cascades.” Proceedings of the National Academy of Sciences.
Valente, and Pitts. 2017. An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.” Annual Review of Public Health.
Vosoughi, Roy, and Aral. 2018. The Spread of True and False News Online.” Science.
Watts, and Dodds. 2007. Influentials, Networks, and Public Opinion Formation.” Journal of Consumer Research.
West, and Bergstrom. 2011. Can Ignorance Promote Democracy? Science.
Wilson, Zhou, and Starbird. 2018. Assembling Strategic Narratives: Information Operations As Collaborative Work Within an Online Community.” Proc. ACM Hum.-Comput. Interact.
Winter. 2019. Online Hate: From the Far-Right to the ‘Alt-Right’ and from the Margins to the Mainstream.” In Online Othering: Exploring Digital Violence and Discrimination on the Web. Palgrave Studies in Cybercrime and Cybersecurity.