
Policy Brief No. 263

Reclaiming Attention: From Digital Conflict to Democratic Dialogue

Jordan Ryan

January 08, 2026

This policy brief poses the question: which human capacities does digital polarisation erode, and why does their erosion matter for democratic life? It builds on the comprehensive governance architecture developed by the Toda Peace Institute, notably Lisa Schirch’s Blueprint for Prosocial Tech Design Governance and Social Media Impacts on Conflict and Democracy, which address the dynamics of digital polarisation threatening democratic governance and establish policy frameworks, regulatory mechanisms, and design interventions for constraining platform harms. Drawing on Simone Weil’s analysis of attention, affliction, and uprootedness, the brief offers a theory of democratic capacity that clarifies what platform governance must protect, and concludes with four policy actions that ground platform accountability in the democratic capacities it must preserve.

 


Abstract

Digital polarisation threatens democratic governance faster than existing regulatory and peacebuilding frameworks can respond. The Toda Peace Institute has developed a comprehensive governance architecture to address these dynamics, notably through Lisa Schirch’s Blueprint for Prosocial Tech Design Governance and Social Media Impacts on Conflict and Democracy, which establish policy frameworks, regulatory mechanisms, and design interventions for constraining platform harms. This brief addresses a prior question: which human capacities does digital polarisation erode, and why does their erosion matter for democratic life?

Drawing on Simone Weil’s analysis of attention, affliction, and uprootedness, the brief offers a theory of democratic capacity that clarifies what platform governance must protect. It connects this framework to Build Up’s polarisation footprint methodology, demonstrating how the erosion of democratic capacity can be measured and governed. The brief concludes with four policy actions that ground platform accountability in the democratic capacities it must preserve, with direct implications for national regulators, multilateral institutions, and peacebuilding actors operating in polarised contexts.

1. The governance gap: From platform design to democratic capacity

Digital polarisation now operates at a scale and speed that existing governance systems were not designed to manage. The contemporary digital environment does not merely reflect social divisions; it actively amplifies and engineers them. Engagement-driven ranking systems, profit-based outrage incentives, and opaque moderation practices intensify grievance, fuel identity-based hostility, and accelerate the erosion of public trust.

The Toda Peace Institute’s existing work has mapped this terrain. Lisa Schirch’s Social Media Impacts on Conflict and Democracy documents how platform dynamics drive what she terms “social climate change” across thirteen country contexts, while her Blueprint for Prosocial Tech Design Governance provides a practical framework for regulatory intervention, design standards, and civil-society action.[1] Together, these contributions establish the governance architecture: the mechanisms through which states, platforms, and civil society can constrain harm and incentivise prosocial design.

Yet governance frameworks presuppose an account of what they protect. Platform regulation aims to preserve something, but what, precisely? The answer implicit in most policy debates is ‘healthy information environments’ or ‘democratic discourse’. These formulations are not wrong, but they remain underspecified. They do not explain why polarisation matters at the level of democratic capacity, or why it threatens not merely the quality of debate but the human faculties on which democratic participation depends.

This brief addresses that gap. It argues that existing governance frameworks possess important tools but remain incomplete unless they make explicit what they are meant to safeguard: the civic capacities that allow citizens to deliberate, participate, and accept outcomes as legitimate. Grounding platform governance in those capacities makes regulation more precise, more defensible, and more relevant to prevention.

2. Simone Weil and the theory of democratic capacity

The idea that democracy depends on specific capacities is well established. The United Nations Development Programme (UNDP) and other multilateral institutions have long analysed the institutional, social, and civic conditions that enable democratic governance, including participation, trust, accountability, and inclusion. Using Weil’s framework does not replace this body of work. It complements it by shifting attention to the human faculties that make those conditions operable in practice: the capacity to attend to complexity, to act with agency rather than fear, and to experience belonging within a shared political world. These capacities become especially salient in a digitalised public sphere, where platform design increasingly shapes cognition, voice, and collective meaning.

Simone Weil was a French philosopher and political thinker writing amid Europe’s descent into authoritarianism and totalitarian rule. Her concern was not only the consolidation of power in authoritarian states, but the quieter destruction of the human capacities that make self-government possible. Totalitarian systems, she argued, succeed not merely by imposing ideology; they succeed by eroding attention, stripping agency, and uprooting individuals from shared frameworks of meaning, leaving people unable to deliberate, resist, or act together.[2]

Today’s digital systems are not totalitarian regimes. Yet they reproduce, at scale and through different means, many of the same human consequences Weil identified. Algorithmic amplification, engagement-driven design, and platform-mediated harassment do not require authoritarian intent to generate authoritarian effects. They weaken attention, produce forms of affliction that silence participation, and fragment shared reality, creating conditions in which democratic self-government becomes increasingly difficult to sustain.

Weil’s analysis identifies three interrelated human faculties as foundational to democratic life: attention, agency, and belonging. The following paragraphs examine each of these faculties in turn and show how digital dynamics erode them.

First, attention, for Weil, was a moral and civic discipline: the capacity to suspend immediate reaction in order to receive reality as it is, including the reality of others. Democratic life depends on this capacity. Without attention, citizens cannot hold complexity, recognise the humanity of those with whom they disagree, or deliberate rather than merely react. Contemporary digital platforms systematically erode attention through infinite scroll, autoplay, and constant notifications that eliminate pause and reflection.[3] For regulators, the failure to address this erosion of attention constitutes not just a cultural concern but a systemic governance gap.

Second, democratic life requires agency: the capacity for individuals to act with purpose and voice in the public sphere. When people can participate without fear of retribution, they can contribute to collective decision-making and hold power to account. Weil used the term “affliction” to describe the condition in which suffering, particularly from social and political sources, destroys a person’s social standing and silences their voice. In digital environments, coordinated harassment, algorithmic amplification of abuse, and disinformation campaigns function as modern mechanisms of affliction.[4][5] They do not merely cause distress; they systematically destroy political agency, driving journalists, activists, and other citizens from public life. For policymakers, affliction names a specific democratic harm: the systematic destruction of standing and voice through sustained attack.

Third, democratic participation depends on belonging: a sense of rootedness in shared frameworks of meaning and community. This allows for trust and solidarity, even amidst disagreement. Weil warned of “uprootedness” as a precondition for authoritarian domination, arguing that people who have lost their roots become available for mobilisation by forces that offer false belonging through exclusion and hostility. Algorithmic personalisation and engagement-driven business models intensify this dynamic by fragmenting information environments and producing parallel realities. The result is not simply misinformation, but the erosion of the shared ground on which democratic negotiation depends. For peacebuilders, this represents a structural driver of conflict that prevention frameworks must now incorporate.

Taken together, Weil’s framework provides what governance architectures require but do not themselves supply: a clear account of the human capacities that platform regulation must protect if democratic self-government is to remain viable.

3. From theory to measurement: The polarisation footprint

Effective governance requires measurement. Without credible tools, the erosion of democratic capacity remains difficult to compare across platforms, track over time, or integrate into regulatory and prevention decisions. Platforms can dismiss harms as subjective; regulators lack comparable indicators; peacebuilding actors struggle to detect escalation until damage is well advanced.

In response, the peacebuilding organisation Build Up has developed the polarisation footprint, a methodology that makes affective polarisation visible, measurable, and governable by treating it as a negative externality of platform design rather than an incidental by-product of user behaviour.[6] This framing shifts responsibility from individuals to systems and opens space for governance responses that address root causes.

The polarisation footprint disaggregates harm across three dimensions: attitude polarisation (hostility toward perceived out-groups), norm polarisation (the normalisation of abusive behaviour), and interaction polarisation (fragmentation into insulated conversational clusters). These dimensions map directly onto Weil’s failure modes: attention erosion, affliction, and uprootedness. The alignment is not coincidental: Weil’s framework explains why these harms matter for democratic life, while the polarisation footprint provides the means to detect and measure them.
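To make these three dimensions concrete, the following sketch shows one way such indicators could in principle be computed from a sample of human-labelled posts. It is a minimal illustration under stated assumptions: the record fields, labels, and formulas below are placeholders invented for exposition and do not reproduce Build Up’s actual annotation scheme or scoring.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical labelled-post record: the field names and labels are illustrative
# placeholders, not Build Up's actual annotation schema.
@dataclass
class Post:
    author_group: str                # political cluster of the author
    out_group_hostility: bool        # annotator judgement: hostile toward an out-group
    abusive: bool                    # annotator judgement: abusive or dehumanising
    replied_to_group: Optional[str]  # cluster of the account replied to, if any

def attitude_polarisation(posts: list[Post]) -> float:
    """Share of posts expressing hostility toward a perceived out-group."""
    return sum(p.out_group_hostility for p in posts) / len(posts)

def norm_polarisation(posts: list[Post]) -> float:
    """Share of abusive posts, a rough proxy for how normalised abuse has become."""
    return sum(p.abusive for p in posts) / len(posts)

def interaction_polarisation(posts: list[Post]) -> Optional[float]:
    """Conversational insulation: 1.0 = no cross-group replies, 0.0 = all replies cross groups."""
    replies = [p for p in posts if p.replied_to_group is not None]
    if not replies:
        return None
    cross = sum(p.author_group != p.replied_to_group for p in replies)
    return 1 - cross / len(replies)
```

In practice, any such indicator would also require representative sampling, inter-annotator reliability checks, and platform-specific definitions of groups and abusive content before it could support the cross-platform comparisons discussed below.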

Crucially, the methodology complements existing regulatory tools. Under the European Union’s Digital Services Act, very large online platforms are required to identify and mitigate systemic risks to civic discourse.[7] The polarisation footprint provides a practical means of operationalising those obligations by translating abstract risks into comparable indicators. For donors and multilaterals, the methodology offers a basis for monitoring, conditionality, and early warning.

Pilot applications demonstrate both the methodology’s practical value and its connection to Weil’s framework. During Kenya’s 2022 election period, Build Up found that approximately 23 per cent of political content on X contained dehumanising language, compared with significantly lower levels on Facebook and WhatsApp. These differences reflect platform architecture, moderation practices, and incentive structures. In Weil’s terms, the dehumanising language captured here is not simply ‘toxicity’; it is an observable indicator of attention failure and incipient affliction—evidence that democratic capacities are being actively eroded.
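A headline percentage such as the 23 per cent figure above is only comparable across platforms if sample sizes and uncertainty are reported alongside it. The brief sketch below illustrates that point with invented counts (only the approximate 23 per cent share for X comes from the cited analysis): it computes a prevalence estimate with a simple confidence interval so that differences between platforms can be judged against sampling noise.

```python
import math

def prevalence_with_ci(flagged: int, total: int, z: float = 1.96) -> tuple[float, tuple[float, float]]:
    """Point estimate and approximate 95% interval for the share of flagged content."""
    p = flagged / total
    se = math.sqrt(p * (1 - p) / total)
    return p, (p - z * se, p + z * se)

# Illustrative counts only: the ~23% share for X is reported in the source;
# the sample sizes and the Facebook/WhatsApp counts are invented for this example.
samples = {"X": (230, 1000), "Facebook": (80, 1000), "WhatsApp": (40, 1000)}
for platform, (flagged, total) in samples.items():
    share, (low, high) = prevalence_with_ci(flagged, total)
    print(f"{platform}: {share:.1%} dehumanising (approx. 95% CI {low:.1%} to {high:.1%})")
```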

4. Policy actions

The following actions build on Toda’s governance architecture by grounding platform accountability in the democratic capacities it must protect. Where existing recommendations focus on what platforms should do, Weil’s framework clarifies what is at stake if they fail: not merely degraded discourse, but the erosion of the human faculties on which self-government depends.

4.1 Mandate Systemic Democratic Capacity Risk Assessment
National communications authorities and EU regulators should require large platforms to assess and disclose systemic risks to democratic capacity, including attention erosion, affliction, and fragmentation of shared information environments. Measurement tools such as the polarisation footprint should be incorporated into existing systemic-risk processes. By making attention erosion visible as a governance risk, regulators can act before manipulation and domination take hold.

4.2 Protect Agency: Establish Remedies for Coordinated Digital Harassment
Coordinated harassment campaigns constitute a mechanism for producing affliction. Legal frameworks should distinguish between isolated speech and organised patterns of abuse intended to silence participation.
Drawing on ‘course of conduct’ concepts in harassment law, which address patterns of behaviour rather than isolated incidents, remedies should address cumulative harm while safeguarding legitimate expression. The aim is to prevent the erosion of agency that leaves populations governable through intimidation rather than persuasion.

4.3 Restore Belonging: Invest in Public-Interest Digital Infrastructure
Constraining harm is necessary but insufficient. Democratic societies must invest in digital environments designed to support shared meaning rather than extract engagement. UNDP, bilateral donors, multilateral development banks, and the UN Peacebuilding Fund should support deliberative platforms, interoperability standards, and community-governed moderation systems. By strengthening shared spaces, such investments counter the uprootedness that makes exclusionary and authoritarian narratives effective.

4.4 Integrate Democratic Capacity into Peacebuilding Practice
Peacebuilding institutions, including UNDP, the UN Department of Political and Peacebuilding Affairs (DPPA), the Peacebuilding Support Office (PBSO), and regional organisations, should integrate democratic-capacity indicators into conflict analysis, early warning, and prevention strategies. In practice, this means incorporating indicators of attention erosion, affliction, and uprootedness into analytical frameworks used by UN country teams and bilateral donors, and training mediators and facilitators to understand how platform dynamics shape conflicts. Experience from Myanmar, Ethiopia, and Sri Lanka demonstrates that digital dynamics operate as conflict accelerants that existing systems often fail to detect until harm is advanced.
Treating digital polarisation as a core peacebuilding concern enables earlier intervention and more accurate risk assessment.

5. Implications for democratic backsliding and prevention

Democratic erosion today often occurs before institutions collapse. Digital systems increasingly shape political behaviour, legitimacy, and mobilisation, weakening democratic capacity in ways that make societies more governable through fear, confusion, and fragmentation. These dynamics create space for authoritarian actors without requiring authoritarian systems.

A capacity-based approach reframes prevention. Rather than waiting for overt repression, it focuses on early indicators: attention erosion, affliction, and uprootedness. Protecting democratic capacity is therefore not a normative preference but a governance necessity. It allows institutions to act before disagreement hardens into domination.

Conclusion

Digital polarisation is not an accidental by-product of technological innovation. It is the foreseeable outcome of governance systems that have allowed concentrated digital power to reshape political life without corresponding public standards. Reclaiming attention is not a cultural aspiration but a governance imperative. By grounding platform accountability in democratic capacity, and by connecting that theory to credible measurement, democratic societies can begin to manage polarisation at scale, before democratic erosion becomes irreversible.

Notes

[1] Lisa Schirch, Social Media Impacts on Conflict and Democracy: The Techtonic Shift (London: Routledge, 2021); Lisa Schirch, Blueprint for Prosocial Tech Design Governance (Tokyo: Toda Peace Institute, 2025).

[2] Simone Weil, Waiting for God, trans. Emma Craufurd (London: Routledge and Kegan Paul, 1951); Simone Weil, The Need for Roots, trans. Arthur Wills (London: Routledge and Kegan Paul, 1952).

[3] Philipp Lorenz-Spreen et al., ‘Accelerating Dynamics of Collective Attention,’ Nature Communications 10, no. 1 (2019): 1759.

[4] United Nations Human Rights Council, Report of the Independent International Fact-Finding Mission on Myanmar, A/HRC/39/64 (Geneva: United Nations, 12 September 2018).

[5] Amnesty International, Toxic Twitter: A Toxic Place for Women (London: Amnesty International, 2018).

[6] Build Up, ‘The Polarization Footprint,’ three-part series, November 2025, https://howtobuildup.medium.com/the-polarization-footprint-part-1-8a85fdaf35b1. For the Kenya analysis, see part 2 of the series.

[7] European Union, Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act), Official Journal of the European Union L 277 (27 October 2022), Articles 34–35.


The Author

JORDAN RYAN

Jordan Ryan is a member of the International Research Advisory Council of the Toda Peace Institute. He previously served as Assistant Secretary-General of the United Nations and as Vice President for Peace Programs at The Carter Center. His work focuses on peacebuilding, democratic resilience, and the governance of digital power in fragile and divided societies.

Toda Peace Institute

The Toda Peace Institute is an independent, nonpartisan institute committed to advancing a more just and peaceful world through policy-oriented peace research and practice. The Institute commissions evidence-based research, convenes multi-track and multi-disciplinary problem-solving workshops and seminars, and promotes dialogue across ethnic, cultural, religious and political divides. It catalyses practical, policy-oriented conversations between theoretical experts, practitioners, policymakers and civil society leaders in order to discern innovative and creative solutions to the major problems confronting the world in the twenty-first century (see www.toda.org for more information).

Contact Us

Toda Peace Institute
Samon Eleven Bldg., 5th Floor
3-1 Samon-cho, Shinjuku-ku, Tokyo 160-0017, Japan

Email: contact@toda.org