"The old gatekeepers were flawed. They had biases, blind spots, and conflicts of interest. But the alternative isn't no gatekeepers; it's AI gatekeepers optimizing for engagement rather than truth." – Nexus, Chapter 9
Democracy depends on conversation: citizens exchanging views, debating evidence, reaching compromises. This conversation requires certain conditions: shared facts, good faith, common language, and spaces for deliberation. AI and social media are undermining all of these conditions.
Harari doesn't argue that democracy is dead, but that its information infrastructure is under unprecedented stress.
Shared Facts → Parallel Realities: Different groups now consume entirely different information
Good Faith → Tribal Loyalty: Changing your mind based on evidence is seen as betrayal
Common Language → Loaded Terms: The same words mean different things to different groups
Deliberation → Outrage: Algorithms reward emotional reaction, not thoughtful engagement
Recommendation algorithms show you content similar to what you've engaged with before. This creates "filter bubbles": information environments tailored to your existing preferences and beliefs. You see a version of reality that confirms what you already think.
The result: people in the same society, living in the same city, can inhabit completely different information universes.
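The feedback loop behind a filter bubble can be made concrete with a toy simulation. The topics, the counting-based recommender, and all numbers below are invented for illustration; real systems use learned collaborative-filtering models, but the self-reinforcing dynamic is the same.

```python
import random

random.seed(0)

TOPICS = ["politics_left", "politics_right", "sports", "science", "culture"]

def recommend(history, catalog, k=5):
    """Rank catalog items by how often their topic appears in the user's
    engagement history. A crude stand-in for similarity-based recommendation."""
    counts = {t: history.count(t) for t in TOPICS}
    return sorted(catalog, key=lambda t: (-counts[t], random.random()))[:k]

# A user who starts with one extra engagement on a single topic.
history = ["politics_left", "sports", "politics_left"]
catalog = TOPICS * 20  # plenty of items available in every topic

for step in range(10):
    feed = recommend(history, catalog)
    # The user engages with the top item; engagement feeds back into history.
    history.append(feed[0])

# A small initial tilt compounds: the feed collapses onto one topic.
share = history.count("politics_left") / len(history)
print(f"share of one topic in history: {share:.0%}")
```

Even this trivial model shows the mechanism: the recommender never decides to narrow anyone's world; narrowing is simply the fixed point of optimizing for past engagement.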
The 2016 US election and Brexit referendum revealed how fragmented information environments had become. Supporters of different sides weren't just reaching different conclusions from the same facts; they were operating with entirely different facts.
Post-election analysis showed that many viral stories were simply false, but they spread because they confirmed what people wanted to believe.
Social media platforms optimize for engagement: keeping you on the platform as long as possible. Research consistently shows that emotional content, especially outrage, drives engagement. So algorithms promote content that makes you angry, afraid, or indignant.
This is not a conspiracy; it's just optimization. But the effect is to poison democratic discourse with constant emotional manipulation.
Outrage: Content that triggers moral indignation spreads fastest
Simplification: Nuanced arguments lose to punchy slogans
Tribalism: Us-vs-them framing outperforms bridge-building
Novelty: New scandals beat slow-developing stories
Confirmation: Information that confirms existing beliefs feels more satisfying
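The dynamics above can be sketched as a toy feed ranker. The weights, signal names, and example posts are all invented for illustration; the one structural point the sketch makes is faithful to the argument: accuracy simply has no weight in the objective.

```python
# Toy feed ranker that optimizes purely for predicted engagement.
# Weights and posts are hypothetical; real systems learn such signals
# from behavioral data at massive scale.

ENGAGEMENT_WEIGHTS = {
    "outrage": 3.0,   # moral indignation spreads fastest
    "tribal": 2.5,    # us-vs-them framing outperforms bridge-building
    "novelty": 2.0,   # new scandals beat slow-developing stories
    "nuance": 0.5,    # nuanced arguments lose to punchy slogans
    "accuracy": 0.0,  # nothing in the objective rewards truth
}

posts = [
    {"title": "Careful 40-page policy analysis",
     "signals": {"nuance": 1.0, "accuracy": 1.0}},
    {"title": "You won't BELIEVE what they did now",
     "signals": {"outrage": 1.0, "novelty": 1.0}},
    {"title": "Why the other side hates you",
     "signals": {"outrage": 1.0, "tribal": 1.0}},
]

def engagement_score(post):
    """Weighted sum of a post's emotional signals."""
    return sum(ENGAGEMENT_WEIGHTS[s] * v for s, v in post["signals"].items())

for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):4.1f}  {post['title']}")
```

Under this objective the outrage-plus-tribalism post ranks first and the accurate analysis last, without any component of the system ever "deciding" to demote truth.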
Traditional information intermediaries (newspapers, broadcasters, experts) served as filters and validators. They weren't perfect, but they provided some quality control. Social media bypassed these gatekeepers, promising democratization of information.
The result has been not democratization but chaos. Without trusted intermediaries, every claim is equally valid (or equally suspect). Expertise becomes just another opinion.
AI can now generate realistic fake videos, audio, and images. This technology will make it increasingly difficult to distinguish authentic content from fabrication, and the implications for democratic discourse are severe.
Large language models can generate vast quantities of plausible-sounding text. This enables information warfare at unprecedented scale. A state actor or well-resourced group could flood the information environment with AI-generated content, not necessarily to convince anyone of anything specific, but to create so much noise that signal becomes impossible to find.
Harari doesn't offer easy solutions, but he identifies several approaches being tried. None of these is sufficient alone, and all face implementation challenges.
In the worst case, AI-powered information manipulation makes democratic conversation impossible. Citizens retreat into tribal information bubbles. Elections become contests of mobilization and manipulation rather than persuasion. Democratic forms persist but democratic substance dies.
This isn't inevitable, but it's not impossible either.
Democratic deliberation is slow. It requires reading, thinking, discussing, and compromising, all time-consuming activities. AI operates at machine speed. Misinformation can spread globally before fact-checkers finish their coffee.
This creates a structural disadvantage for truth and deliberation in the attention economy.