Tribal sorting and polarization

Polarization and fragmentation by browser cookies

NB: This page is under construction. Content has been recycled from an increasingly inaccurately named “filter bubbles” notebook. See also media weaponization, memetics, epistemic communities etc.

In news


Theirtube is a YouTube filter bubble simulator that provides a look into how videos are recommended on other people’s YouTube. Users can experience how the YouTube home page would look for six different personas. Each persona simulates the viewing environment of a real YouTube user who experienced being inside a recommendation bubble, by recreating an account with a similar viewing history. TheirTube shows how YouTube’s recommendations can drastically shape someone’s experience on the platform and, as a result, shape their worldview. It is part of the Mozilla Creative Media Awards 2020, an art and advocacy project examining AI’s effect on media and truth, developed by Tomo Kihara.

I think the proverb “Fish discover water last” can also be said about how we are blind to the nature of the recommendation bubble that we are in. Nowadays with an AI curating almost all of what we see, the only way for a person to get a better perspective on their own media environment is to see what others’ look like. By offering a tool to understand what the other recommendation bubbles look like, we hope to help people to get a better perspective on their own recommendation bubbles.


Write the news to suit your own biases using the FREE Scarfaset transfers

…newspapers such as the Scarfolk Mail realised that they no longer needed to provide actual content: Readers only saw what they wanted to see and comprehended what they wanted to comprehend.

“Data journalism” has created interesting tools. But do people care about data? Are facts persuasive? Even if facts are persuasive, are facts enough? As Gilad Lotan anecdotally illustrates, merely selecting facts can get you your own little reality, without even bothering to lie.

NYT data journalism on our geographic epistemic bubbles maps partisan sorting in Boston, from an article by Emily Badger, Kevin Quealy and Josh Katz

Let’s get real; for the moment, reasoned engagement with a shared rational enlightenment doesn’t dominate the media. Bread and circuses and kompromat and gut-instinct do.

Hugo Drochon argues, in The Conspiracy Theory Bubble, that interest in conspiracy believers is itself a kind of collective hysteria and that there are not so many conspiracy believers after all. (He does argue for an increased salience of conspiracy believers.)

Kate Starbird’s study of agents provocateurs in online campaigns.

For a false binary, try Nick Cohen, Trump’s lies are not the problem. It’s the millions who swallow them who really matter:

Compulsive believers are not just rednecks. They include figures as elevated as the British prime minister and her cabinet. […]

Mainstream journalists are almost as credulous. After decades of imitating Jeremy Paxman and seizing on the trivial gaffes and small lies of largely harmless politicians, they are unable to cope with the fantastic lies of the new authoritarian movements. When confronted with men who lie so instinctively they believe their lies as they tell them, they can only insist on a fair hearing for the sake of “balance”. Their acceptance signals to the audience the unbelievable is worthy of belief.

This is a shallow causal analysis; however, thinking about the credulity of people in power is interesting.

Alexis Madrigal, What Facebook Did to American Democracy. Buzzfeed, Inside the partisan fight for your newsfeed:

The most comprehensive study to date of the growing universe of partisan websites and Facebook pages about US politics reveals that in 2016 alone at least 187 new websites launched, and that the candidacy and election of Donald Trump has unleashed a golden age of aggressive, divisive political content that reaches a massive amount of people on Facebook.

Thanks to a trinity of the internet, Facebook, and online advertising, partisan news websites and their associated Facebook pages are almost certainly making more money for more people and reaching more Americans than at any time in history. In some cases, publishers are generating hundreds of thousands of dollars a month in revenue, with small operations easily earning five figures thanks to one website and at least one associated Facebook page.

At its root, the analysis of 667 websites and 452 associated Facebook pages reveals the extent to which American online political discourse is powered by a mix of money and outrage.

(Goel, Mason, and Watts 2010):

It is often asserted that friends and acquaintances have more similar beliefs and attitudes than do strangers; yet empirical studies disagree over exactly how much diversity of opinion exists within local social networks and, relatedly, how much awareness individuals have of their neighbors’ views. This article reports results from a network survey, conducted on the Facebook social networking platform, in which participants were asked about their own political attitudes, as well as their beliefs about their friends’ attitudes. Although considerable attitude similarity exists among friends, the results show that friends disagree more than they think they do. In particular, friends are typically unaware of their disagreements, even when they say they discuss the topic, suggesting that discussion is not the primary means by which friends infer each other’s views on particular issues. Rather, it appears that respondents infer opinions in part by relying on stereotypes of their friends and in part by projecting their own views. The resulting gap between real and perceived agreement may have implications for the dynamics of political polarization and theories of social influence in general.
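The gap Goel, Mason, and Watts describe between real and perceived agreement can be reproduced in a toy simulation: if people guess a friend’s view mostly by projecting their own, perceived agreement will exceed actual agreement whenever homophily is imperfect. The parameter values here are illustrative, not taken from the paper.

```python
import random

random.seed(0)

# Toy projection model: each respondent guesses a friend's view mostly
# by projecting their own. Homophily and projection rates are made up.
n = 1000
homophily = 0.7   # probability a friend actually shares your view
projection = 0.8  # weight on your own view when guessing a friend's

real_agree = 0
perceived_agree = 0
for _ in range(n):
    me = random.choice([0, 1])
    friend = me if random.random() < homophily else 1 - me
    # The guess leans on projection, with some random noise otherwise.
    guess = me if random.random() < projection else random.choice([0, 1])
    real_agree += (me == friend)
    perceived_agree += (me == guess)

print(f"real agreement:      {real_agree / n:.2f}")
print(f"perceived agreement: {perceived_agree / n:.2f}")
```

With these numbers perceived agreement comes out noticeably higher than real agreement, which is the paper’s qualitative point: friends disagree more than they think they do.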

(Watts and Dodds 2007):

A central idea in marketing and diffusion research is that influentials— a minority of individuals who influence an exceptional number of their peers— are important to the formation of public opinion. Here we examine this idea, which we call the “influentials hypothesis,” using a series of computer simulations of interpersonal influence processes. Under most conditions that we consider, we find that large cascades of influence are driven not by influentials but by a critical mass of easily influenced individuals. Although our results do not exclude the possibility that influentials can be important, they suggest that the influentials hypothesis requires more careful specification and testing than it has received.
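The mechanism behind Watts and Dodds’s conclusion is a threshold cascade: a node adopts once the adopting fraction of its neighbours exceeds its threshold, so cascade size depends on the supply of easily influenced nodes, not on the seed’s degree. Here is a minimal sketch of that setup on a random graph; the sizes, degrees, and threshold are illustrative, not the paper’s parameters.

```python
import random

random.seed(1)

# Threshold cascade on a random graph: compare seeding at the highest-degree
# "influential" versus an ordinary median-degree node. Illustrative parameters.
n, k, threshold = 500, 6, 0.12

def random_graph(n, k):
    """Random graph with mean degree roughly k, as adjacency sets."""
    nbrs = [set() for _ in range(n)]
    while sum(len(s) for s in nbrs) < n * k:
        a, b = random.randrange(n), random.randrange(n)
        if a != b:
            nbrs[a].add(b)
            nbrs[b].add(a)
    return nbrs

def cascade(nbrs, seed):
    """Activate seed; nodes adopt when the active neighbour fraction
    reaches the threshold. Returns final cascade size."""
    active = {seed}
    changed = True
    while changed:
        changed = False
        for v in range(len(nbrs)):
            if v not in active and nbrs[v]:
                if len(nbrs[v] & active) / len(nbrs[v]) >= threshold:
                    active.add(v)
                    changed = True
    return len(active)

nbrs = random_graph(n, k)
hub = max(range(n), key=lambda v: len(nbrs[v]))
ordinary = sorted(range(n), key=lambda v: len(nbrs[v]))[n // 2]
size_hub = cascade(nbrs, hub)
size_ordinary = cascade(nbrs, ordinary)
print("cascade from hub node:     ", size_hub)
print("cascade from ordinary node:", size_ordinary)
```

When the threshold is low enough that many nodes are “vulnerable” (adopt with a single active neighbour), an ordinary node triggers a global cascade about as readily as the hub does, which is the critical-mass point.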

(Martin and Yurukoglu 2017):

On the right, audiences concentrate attention on purely right wing outlets. On the left and center audiences spread their attention broadly and focus on mainstream organizations. This asymmetric pattern holds for the linking practices of media producers. Both supply and demand on the right are insular and self-focused. On the left and center they are spread broadly and anchored by professional press.

These differences create a different dynamic for media, audiences, and politicians on the left and right.

We all like to hear news that confirms our beliefs and identity. On the left, outlets and politicians try to attract readers by telling such stories but are constrained because their readers are exposed to a range of outlets, many of which operate with strong fact-checking norms.

On the right, because audiences do not trust or pay attention to outlets outside their own ecosystem, there is no reality check to constrain competition. Outlets compete on political purity and stoking identity-confirming narratives. Outlets and politicians who resist the flow by focusing on facts are abandoned or vilified by audiences and competing outlets. This forces media and political elites to validate and legitimate the falsehoods, at least through silence, creating a propaganda feedback loop. …

The highly asymmetric pattern of media ecosystems we observe on the right and the left, despite the fact that Facebook and Twitter usage is roughly similar on both sides, requires that we look elsewhere for what is causing the difference.

Surveys make it clear that Fox News is by far the most influential outlet on the American right — more than five times as many Trump supporters reported using Fox News as their primary news outlet than those who named Facebook. And Trump support was highest among demographics whose social media use was lowest.

Our data repeatedly show Fox as the transmission vector of widespread conspiracy theories.

This is a strong claim; I have seen more or less the contrary asserted about which side of politics is more insular. I suspect this comes down to definitions of attention.

Farrell (n.d.) claims Fox News moved the 2008 Republican presidential vote share 6.34 percentage points to the right. (I have not read this article yet.)

In other results, we estimate that removing Fox News from cable television during the 2000 election cycle would have reduced the overall Republican presidential vote share by 0.46 percentage points. The predicted effect increases in 2004 and 2008 to 3.59 and 6.34 percentage points, respectively. This increase is driven by increasing viewership on Fox News as well as an increasingly conservative slant.

I have not yet seen how much real analysis is done in (Garimella et al. 2018), about which it is claimed:

The study identifies three essential roles for Twitter users. Partisan users both consume and produce content with only a one-sided leaning and enjoy high appreciation measured by both network centrality and content endorsement. Gatekeepers have a central role in the formation of echo chambers because they consume content with diverse leanings but choose to produce only content with a one-sided leaning. Bipartisan users produce content with both leanings and make an effort to bridge the echo chambers, but they are less valued in their networks than their partisan counterparts.

A methodology that clusters users into three discrete groups smells funky, but I really need to read the paper to see what they actually did.
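As a reading aid, the three roles reduce to a simple decision rule on the leanings of what a user consumes versus what they produce. The function below is my guess at the shape of that split, not the paper’s actual method, and the leaning representation is invented for illustration.

```python
def classify_user(consumed_leanings, produced_leanings):
    """Toy version of the partisan / gatekeeper / bipartisan taxonomy
    in Garimella et al. (2018). Leanings are floats in [-1, 1]; the
    rule here is a hypothetical reading of the paper's idea."""
    def one_sided(xs):
        # All items lean the same way (strictly positive or negative).
        return bool(xs) and (all(x > 0 for x in xs) or all(x < 0 for x in xs))

    if one_sided(produced_leanings):
        # Produces one-sided content; the difference is in consumption.
        return "partisan" if one_sided(consumed_leanings) else "gatekeeper"
    return "bipartisan"


print(classify_user([0.5, 0.9], [0.7]))    # consumes and produces one side
print(classify_user([-0.5, 0.6], [0.8]))   # consumes both, produces one side
print(classify_user([0.2], [-0.3, 0.4]))   # produces both sides
```

In practice the paper would have to pick a continuous cutoff rather than a strict all-one-sided rule, which is exactly where the “three discrete groups” worry bites.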

Maria Bustillos

So let’s say you’re a media owner who’s “all business,” caring only for the bottom line, looking to keep shareholders and owners happy, buy yourself some expensive houses, and/or get yourself or your friends re-elected. Without a doubt, the journalists who are prepared to tell readers the truth—even about you and your friends!—can only be a hindrance, and are best shut up or rid of, to the extent possible.

Computing scenarios for defusing polarized politics summarises Axelrod, Daymude, and Forrest (2021). A figure from that paper shows the effects of economic self-interest: polarization of the population’s ideological positions over time with varying probability P = 0%, 1%, …, 10% that an actor is attracted to its preferred (initial) position, with insets showing the initial normal distribution of actors’ positions and the final configurations after 2,500,000 steps for P = 0%, 1%, and 10%.
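The core dynamic in the Axelrod, Daymude, and Forrest model is attraction–repulsion: when two actors’ positions are within a tolerance they move closer, otherwise they move apart, and positions are clamped to a bounded ideological interval. A minimal sketch of that dynamic (without the self-interest term P, and with made-up parameter values) looks like this:

```python
import random

random.seed(2)

# Attraction-repulsion sketch after Axelrod, Daymude & Forrest (2021).
# Tolerance, step size, and iteration count are illustrative, not the paper's.
n, tolerance, step, iters = 100, 0.25, 0.02, 50_000
pos = [random.gauss(0.0, 0.2) for _ in range(n)]  # initial normal distribution

def clamp(x):
    """Keep positions on the bounded ideological interval [-1, 1]."""
    return max(-1.0, min(1.0, x))

for _ in range(iters):
    i, j = random.sample(range(n), 2)
    if abs(pos[i] - pos[j]) <= tolerance:
        pos[i] = clamp(pos[i] + step * (pos[j] - pos[i]))  # attraction
    else:
        pos[i] = clamp(pos[i] - step * (pos[j] - pos[i]))  # repulsion

print(f"final spread of positions: {max(pos) - min(pos):.2f}")
```

When the initial spread exceeds the tolerance, repulsion tends to push the population toward two clusters at the boundary, which is the extreme-polarization outcome the paper’s interventions try to prevent.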


  • NYU Stern Center for Business and Human Rights, Polarization Report

  • Ian Leslie, The differences of minor narcissists

    differences don’t cause conflicts; conflicts create differences. Members of a group seize on differences in order to affirm their own identity. A feedback loop ensues: differences are invented or enlarged, which stimulates further animosity, which magnifies differences, and so on.

  • Not Boring by Packy McCormick, Amplified Tribalism

  • Irrational Institutions #2. File under filter bubbles, reality bubbles, subculture dynamics.

  • Novoa et al. (2023)

    The use of category-referring statements, also known as generics (e.g., “Democrats want to defund the police”), may contribute to polarization by encouraging the adoption of broad conclusions about political categories that ignore variation within each political party. […] These findings suggest that the use of generic language, common in everyday speech, enables inferential errors that exacerbate perceived polarization.


Aistrope, Tim. 2016. Social Media and Counterterrorism Strategy.” Australian Journal of International Affairs 70 (2): 121–38.
Arguedas, Amy Ross, Craig T. Robertson, Richard Fletcher, and Rasmus K. Nielsen. 2022. Echo Chambers, Filter Bubbles, and Polarisation: A Literature Review.”
Arif, Ahmer, Leo Graiden Stewart, and Kate Starbird. 2018. Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse.” Proc. ACM Hum.-Comput. Interact. 2 (CSCW): 20:1–27.
Ataöv, Türkkaya. 1998. Narcissism of Minor Differences: Nurturing the ‘Clash of Civilizations’.” In. na.
Axelrod, Robert, Joshua J. Daymude, and Stephanie Forrest. 2021. Preventing Extreme Polarization of Political Attitudes.” Proceedings of the National Academy of Sciences 118 (50).
Banerjee, Abhijit, Arun G Chandrasekhar, Esther Duflo, and Matthew O Jackson. 2019. Using Gossips to Spread Information: Theory and Evidence from Two Randomized Controlled Trials.” The Review of Economic Studies 86 (6): 2453–90.
Benkler, Yochai, Rob Faris, and Harold Roberts. 2018. Network propaganda: manipulation, disinformation, and radicalization in American politics. New York, NY: Oxford University Press.
Bessi, Alessandro. 2016. On the Statistical Properties of Viral Misinformation in Online Social Media.” arXiv:1609.09435 [Physics, Stat], September.
Bradshaw, S., and P. Howard. 2017. Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation 2017.12.
Brito, Kellyton, Natalia Paula, Manoel Fernandes, and Silvio Meira. 2019. Social Media and Presidential Campaigns – Preliminary Results of the 2018 Brazilian Presidential Election.” In Proceedings of the 20th Annual International Conference on Digital Government Research, 332–41. Dg.o 2019. New York, NY, USA: ACM.
Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84.
Bursztyn, Victor S, and Larry Birnbaum. 2019. “Thousands of Small, Constant Rallies: A Large-Scale Analysis of Partisan WhatsApp Groups,” 6.
Cadwalladr, Carole. 2017. The Great British Brexit Robbery: How Our Democracy Was Hijacked.” The Guardian, May 7, 2017, sec. Technology.
Cha, Meeyoung, Hamed Haddadi, Fabricio Benevenuto, and Krishna P. Gummadi. 2010. Measuring User Influence in Twitter: The Million Follower Fallacy.” In Fourth International AAAI Conference on Weblogs and Social Media.
Coscia, Michele. 2017. Popularity Spikes Hurt Future Chances for Viral Propagation of Protomemes.” Communications of the ACM 61 (1): 70–77.
Dittmar, Jeremiah E., and Skipper Seabold. 2015. Media, Markets, and Radical Ideas: Evidence from the Protestant Reformation.” Centre for Economic Performance Working Paper.
DuFord, Rochelle. 2022. Solidarity in Conflict: A Democratic Theory. Stanford: Stanford University Press.
Evans, David S. 2017. The Economics of Attention Markets.” SSRN Scholarly Paper ID 3044858. Rochester, NY: Social Science Research Network.
Farrell, Henry. n.d. Analysis | Blame Fox, Not Facebook, for Fake News.” Washington Post, sec. Monkey Cage.
Farrell, Henry, and Bruce Schneier. 2018. Common-Knowledge Attacks on Democracy.” SSRN Scholarly Paper ID 3273111. Rochester, NY: Social Science Research Network.
Garimella, Kiran, Gianmarco De Francisci Morales, Aristides Gionis, and Michael Mathioudakis. 2018. Political Discourse on Social Media: Echo Chambers, Gatekeepers, and the Price of Bipartisanship.” arXiv:1801.01665 [Cs], February.
Gelman, Andrew. 2007. Struggles with Survey Weighting and Regression Modeling.” Statistical Science 22 (2): 153–64.
Goel, Sharad, Ashton Anderson, Jake Hofman, and Duncan J. Watts. 2015. The Structural Virality of Online Diffusion.” Management Science, July, 150722112809007.
Goel, Sharad, Winter Mason, and Duncan J. Watts. 2010. Real and Perceived Attitude Agreement in Social Networks.” Journal of Personality and Social Psychology 99 (4): 611–21.
Goel, Sharad, Duncan J. Watts, and Daniel G. Goldstein. 2012. The Structure of Online Diffusion Networks.” In Proceedings of the 13th ACM Conference on Electronic Commerce - EC ’12, 623. Valencia, Spain: ACM Press.
Golub, Benjamin, and Matthew O. Jackson. 2010. Naïve Learning in Social Networks and the Wisdom of Crowds.” American Economic Journal: Microeconomics 2 (1): 112–49.
———. 2011. Network Structure and the Speed of Learning: Measuring Homophily Based on Its Consequences.” SSRN Scholarly Paper ID 1784542. Rochester, NY: Social Science Research Network.
———. 2012. How Homophily Affects the Speed of Learning and Best-Response Dynamics.” The Quarterly Journal of Economics 127 (3): 1287–1338.
Gonzalez-Bailon, Sandra. 2009. Opening the Black Box of Link Formation: Social Factors Underlying the Structure of the Web.” Social Networks 31 (4): 271–80.
Granovetter, Mark. 1983. The Strength of Weak Ties: A Network Theory Revisited.” Sociological Theory 1 (1): 201–33.
Granovetter, Mark S. 1973. The Strength of Weak Ties.” The American Journal of Sociology 78 (6): 1360–80.
Howard, Philip N., and Bence Kollanyi. 2016. Bots, #StrongerIn, and #Brexit: Computational Propaganda During the UK-EU Referendum.”
Jackson, Matthew O. 2018. The Friendship Paradox and Systematic Biases in Perceptions and Social Norms.” Journal of Political Economy 127 (2): 777–818.
Jackson, Matthew O., Suraj Malladi, and David McAdams. 2019. Learning Through the Grapevine: The Impact of Noise and the Breadth and Depth of Social Networks.” SSRN Scholarly Paper ID 3269543. Rochester, NY: Social Science Research Network.
Johnson, Hollyn M., and Colleen M. Seifert. 1994. Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Learning, Memory 20 (6): 1420–36.
Kempe, David, Jon Kleinberg, and Éva Tardos. 2003. Maximizing the Spread of Influence Through a Social Network.” In Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 137–46. KDD ’03. New York, NY, USA: ACM.
Klausen, Jytte. 2015. Tweeting the Jihad: Social Media Networks of Western Foreign Fighters in Syria and Iraq.” Studies in Conflict & Terrorism 38 (1): 1–22.
Lee, Eun, Fariba Karimi, Hang-Hyun Jo, Markus Strohmaier, and Claudia Wagner. 2017. Homophily Explains Perception Biases in Social Networks.” arXiv:1710.08601 [Physics], October.
Lerman, Kristina, Xiaoran Yan, and Xin-Zeng Wu. 2016. The ‘Majority Illusion’ in Social Networks.” PLOS ONE 11 (2): e0147617.
Leslie, Ian. 2021. Conflicted: How Productive Disagreements Lead to Better Outcomes. New York, NY: Harper Business.
Levy, Gilat, and Ronny Razin. 2019. Echo Chambers and Their Effects on Economic and Political Outcomes.” Annual Review of Economics 11 (1): 303–28.
Lewis, Rebecca. n.d. “Broadcasting the Reactionary Right on YouTube,” 61.
Lin, Herbert, and Jaclyn Kerr. 2019. On Cyber-Enabled Information Warfare and Information Operations.” SSRN Scholarly Paper ID 3015680. Rochester, NY: Social Science Research Network.
Machado, Caio, Beatriz Kira, Vidya Narayanan, Bence Kollanyi, and Philip Howard. 2019. A Study of Misinformation in WhatsApp Groups with a Focus on the Brazilian Presidential Elections. In Companion Proceedings of The 2019 World Wide Web Conference, 1013–19. WWW ’19. New York, NY, USA: ACM.
Martin, Gregory J., and Ali Yurukoglu. 2017. Bias in Cable News: Persuasion and Polarization.” American Economic Review 107 (9): 2565–99.
Marwick, Alice, and Rebecca Lewis. 2017. Media Manipulation and Disinformation Online.” Data & Society Research Institute.
Munn, Luke. 2019. Alt-Right Pipeline: Individual Journeys to Extremism Online.” First Monday 24 (6).
Novoa, Gustavo, Margaret Echelbarger, Andrew Gelman, and Susan A. Gelman. 2023. Generically Partisan: Polarization in Political Communication.” Proceedings of the National Academy of Sciences 120 (47): e2309361120.
O’Connor, Cailin, and James Owen Weatherall. 2019. The Misinformation Age: How False Beliefs Spread. 1 edition. New Haven: Yale University Press.
Oliver, Eric, and Tom Wood. 2014. Larger Than Life.” New Scientist 224 (3000): 36–37.
Oliver, J. Eric, and Thomas J. Wood. 2014. Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66.
Olteanu, Alexandra, Carlos Castillo, Fernando Diaz, and Emre Kıcıman. 2019. Social Data: Biases, Methodological Pitfalls, and Ethical Boundaries.” Frontiers in Big Data 2.
Redlawsk, David P., Andrew J. W. Civettini, and Karen M. Emmerson. 2010. The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’? Political Psychology 31 (4): 563–93.
Ribeiro, Manoel Horta, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira. 2019. Auditing Radicalization Pathways on YouTube.” arXiv:1908.08313 [Cs], August.
Rieder, Bernhard, Ariadna Matamoros-Fernández, and Òscar Coromina. 2018. From Ranking Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube Search Results.” Convergence 24 (1): 50–68.
Röttger, Paul, and Balazs Vedres. 2020. “The Information Environment and Its Effects on Individuals and Groups,” 60.
Salganik, Matthew J., and Duncan J. Watts. 2008. Leading the Herd Astray: An Experimental Study of Self-Fulfilling Prophecies in an Artificial Cultural Market.” Social Psychology Quarterly 74 (4): 338.
Schuchard, Ross, Andrew T. Crooks, Anthony Stefanidis, and Arie Croitoru. 2019. Bot Stamina: Examining the Influence and Staying Power of Bots in Online Social Networks.” Applied Network Science 4 (1): 1–23.
Shakya, Holly B., and Nicholas A. Christakis. 2017. Association of Facebook Use With Compromised Well-Being: A Longitudinal Study.” American Journal of Epidemiology 185 (3).
Softky, William, and Criscillia Benford. 2017. Sensory Metrics of Neuromechanical Trust.” Neural Computation 29 (9): 2293–2351.
Staines, Cassie, and Will Moy. 2018. Tackling Misinformation in an Open Society.”
Starbird, Kate. 2017. Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In Eleventh International AAAI Conference on Web and Social Media.
———. 2019. Disinformation’s Spread: Bots, Trolls and All of Us.” Nature 571 (July): 449.
Stewart, Leo G, Ahmer Arif, and Kate Starbird. 2018. “Examining Trolls and Polarization with a Retweet Network,” 6.
Stewart, Leo Graiden, Ahmer Arif, A. Conrad Nied, Emma S. Spiro, and Kate Starbird. 2017. Drawing the Lines of Contention: Networked Frame Contests Within #BlackLivesMatter Discourse.” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 1–23.
Tajfel, Henri. 1982. Social Psychology of Intergroup Relations.” Annual Review of Psychology 33: 1–39.
Talisse, Robert B. 2021. Sustaining democracy: what we owe to the other side. New York, NY, United States of America: Oxford University Press.
Tokita, Christopher K., Andrew M. Guess, and Corina E. Tarnita. 2021. Polarized Information Ecosystems Can Reorganize Social Networks via Information Cascades.” Proceedings of the National Academy of Sciences 118 (50).
Valente, Thomas W., and Stephanie R. Pitts. 2017. An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.” Annual Review of Public Health 38 (1): 103–18.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. The Spread of True and False News Online.” Science 359 (6380): 1146–51.
Watts, Duncan J., and Peter Sheridan Dodds. 2007. Influentials, Networks, and Public Opinion Formation.” Journal of Consumer Research 34 (4): 441–58.
West, Jevin D., and Carl T. Bergstrom. 2011. Can Ignorance Promote Democracy? Science 334 (6062): 1503–4.
Wilson, Tom, Kaitlyn Zhou, and Kate Starbird. 2018. Assembling Strategic Narratives: Information Operations As Collaborative Work Within an Online Community.” Proc. ACM Hum.-Comput. Interact. 2 (CSCW): 183:1–26.
Winter, Aaron. 2019. Online Hate: From the Far-Right to the ‘Alt-Right’ and from the Margins to the Mainstream.” In Online Othering: Exploring Digital Violence and Discrimination on the Web, edited by Karen Lumsden and Emily Harmer, 39–63. Palgrave Studies in Cybercrime and Cybersecurity. Cham: Springer International Publishing.
