Designing less cruel social media

2020-07-11 — 2026-03-09

Wherein an iterative remaking of online commons is proposed, with IndieWeb and Fediverse noted, and platform rules are to be set by compensated mini-publics convened for moderation and privacy.

confidentiality
democracy
distributed
diy
economics
evolution
game theory
insurgency
networks
P2P
wonk
Figure 1

What if we needed to do less harm minimization of social network behaviour, because social networks were less toxic, less addictive, and less fertile ground for weaponized corrosiveness?

What might such networks look like? Since this is an evolutionary process, I suspect we need to consider an iterative design process, rather than hoping to nail the perfect solution in one go. Maybe think about everything here as a plausible “next move” rather than the “end game”.

1 Indieweb

The artisanal social media movement is a good start. See Indieweb for more.

2 Fediverse

See Fediverse for more.

3 Platform democracy

See Towards Platform Democracy: Policymaking Beyond Corporate CEOs and Partisan Pressure for more.

Facebook, YouTube, and other platforms make incredibly impactful decisions about the speech of billions. Right now, those decisions are primarily in the hands of corporate CEOs—and heavily influenced by pressure from partisan and authoritarian governments aiming to entrench their own power.

We propose an alternative: platform democracy. In the past decade, a new suite of democratic processes have been shown to be surprisingly effective at navigating challenging and controversial issues, from nuclear power policy in South Korea to abortion in Ireland. These processes have been tested around the world, overcome the pitfalls of elections and referendums, and can work at platform scale. They enable the creation of independent ‘people’s mandates’ for platform policies—something invaluable for the impacted populations, the governments which are constitutionally unable to act on speech, and even the platforms themselves.

Key quote:

The Challenge: Who decides? (On divisive platform policies)

  • Complex policy issues: Online platforms must make policy decisions around controversial issues such as content moderation, political advertising, recommendations, and privacy.
  • Deciders often compromised: Currently, either platform CEOs (and their teams) ultimately determine platform policy or powerful governments do; often neither is rewarded by serving the public.
  • Negligible public mandate: The public is continually impacted by these decisions and cares about their downstream outcomes (e.g. censorship, misinformation, violence, surveillance), but their perspectives are rarely incorporated (beyond one-sided studies).
  • Platforms are stuck: Even platform CEOs often don’t want to be held responsible for these decisions—there may be no action that ‘looks good’ or which can forestall retaliation from partisan politicians or governments.
  • No obvious alternative: Even within functional democracies, governments are often limited constitutionally or by partisan gridlock. Platform-based referendums have been attempted, but had negligible response rates.

The Context: New democratic mechanisms have handled tough issues at national scale.

  • New democratic decision-making processes have now been shown to make thoughtful decisions and be broadly trusted, without most of the damaging political dynamics of referendums and elections, and for a tiny fraction of the cost.
  • When designed well, these processes can work even when no existing powerful actor is trustworthy and when no one wants to be held responsible for a decision.
  • They often involve creating a demographically representative “mini-public” that is compensated for a fixed time period to learn about an issue from the many multi-stakeholder perspectives, deliberate together, and voice their conclusions.
  • This may seem idealistic and implausible. But these new “representative deliberation processes” have now been used to support complex policy-making around the world, tackling issues from abortion in Ireland to nuclear power in South Korea.
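The “demographically representative mini-public” above is typically assembled by stratified sortition: partition the candidate pool by demographic stratum, apportion panel seats to match population shares, then draw randomly within each stratum. A minimal sketch, assuming a toy candidate pool; the function name `convene_mini_public` and its parameters are illustrative inventions, not from any cited process:

```python
import random
from collections import defaultdict

def convene_mini_public(pool, panel_size, strata_of, targets, seed=0):
    """Draw a demographically representative panel by stratified sortition.

    pool       : list of candidate participants
    panel_size : number of seats on the panel
    strata_of  : function mapping a candidate to their stratum label
    targets    : dict mapping stratum label -> target population share
    """
    rng = random.Random(seed)

    # Group candidates by demographic stratum.
    by_stratum = defaultdict(list)
    for person in pool:
        by_stratum[strata_of(person)].append(person)

    # Apportion seats by largest remainder so quotas sum to panel_size.
    quotas = {s: targets[s] * panel_size for s in targets}
    seats = {s: int(q) for s, q in quotas.items()}
    leftover = panel_size - sum(seats.values())
    for s in sorted(quotas, key=lambda s: quotas[s] - seats[s],
                    reverse=True)[:leftover]:
        seats[s] += 1

    # Random draw (sortition) within each stratum.
    panel = []
    for s, n in seats.items():
        panel.extend(rng.sample(by_stratum[s], n))
    return panel

# Toy pool: 70% of candidates in group A, 30% in group B.
pool = [("A", i) for i in range(700)] + [("B", i) for i in range(300)]
panel = convene_mini_public(pool, 20, strata_of=lambda p: p[0],
                            targets={"A": 0.7, "B": 0.3})
counts = {s: sum(1 for p in panel if p[0] == s) for s in ("A", "B")}
# counts -> {"A": 14, "B": 6}: seats mirror the population shares.
```

Real processes layer on more (opt-in recruitment lotteries, multiple stratification dimensions, compensation), but the core selection logic is this quota-matched random draw rather than an election or referendum.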

The Opportunity: Platforms can use these processes to tackle controversial issues.

  • Platforms, working with governments and civil society, can have experienced and neutral facilitators deploy these new processes for the toughest policy questions.
  • Policy decisions will then be made by the impacted populations and informed by key stakeholders, often leading to a strong public mandate (which may even help defend against partisan or authoritarian overreach).

4 Incoming

Figure 2: Via Oglaf
