"In the past, the KGB couldn't follow everyone. They had to choose their targets. AI doesn't have to choose. It can follow everyone, all the time, automatically. The bottleneck of totalitarianism was human attention. AI removes that bottleneck." (Nexus, Chapter 10)
George Orwell imagined a totalitarian future of telescreens, thought police, and constant surveillance. He couldn't imagine AI. The surveillance state that AI enables goes far beyond Orwell's nightmares: not because it's more brutal, but because it's more thorough, more automated, and more inescapable.
Harari argues that AI may solve the fundamental problem that limited previous totalitarian regimes: the inability to process enough information to truly control a society.
Stalin's Dilemma: To control everything, you need to know everything. But gathering and processing that much information exceeded human capacity.
AI's Answer: Machine learning can process billions of data points, identify patterns, and flag anomalies automatically, continuously, and at scale.
What was impossible for human bureaucracies is trivial for AI systems.
Modern surveillance doesn't require informants or secret police listening at keyholes. It is built into the infrastructure of daily life: phones, cameras, financial records, and social media.
The data already exists. AI makes it usable for control.
China is implementing a "social credit" system that tracks citizen behavior and assigns scores affecting access to jobs, travel, loans, and social services. The system aggregates data from surveillance cameras, financial records, social media, and government databases.
This isn't science fiction: it's operational, and it's being refined with AI to become more comprehensive and automated.
The most chilling application of AI authoritarianism isn't punishing dissent; it's predicting and preventing it before it happens. AI systems can analyze patterns to identify potential troublemakers, nascent movements, or brewing discontent.
This flips the traditional model: instead of reacting to opposition, the state can preemptively neutralize it.
Traditional: Dissent occurs → State detects → State responds
AI-Enabled: AI predicts dissent risk → State intervenes → Dissent never occurs
If the state can identify and "treat" potential dissidents before they act, organized opposition becomes nearly impossible.
Human agents of repression (secret police, informants, censors) have limits: they require salaries, they can be corrupted, and they might have moral qualms. AI has none of these limitations.
Automated systems can monitor, flag, and censor around the clock, without pay, fatigue, corruption, or conscience.
Harari examines China's treatment of the Uyghur population in Xinjiang as a case study of AI-enabled authoritarianism. The region has become a laboratory for surveillance technology.
This represents a new form of totalitarian control: more targeted, more data-driven, and more automated than anything before.
The surveillance technologies developed in Xinjiang are being exported. Chinese companies sell facial recognition, smart city infrastructure, and monitoring systems to governments around the world, from democracies to dictatorships.
The tools of AI authoritarianism are becoming globally available.
Harari asks a disturbing question: could AI-enabled totalitarianism actually work? Previous totalitarian states failed partly because centralized control couldn't process enough information. If AI solves the information problem, might such regimes become stable?
He remains skeptical. Even with perfect surveillance, totalitarian systems still face the self-correction problem from Chapter 4: if no one can tell the leader they're wrong, errors accumulate. AI might improve surveillance without improving decision-making.
The philosopher Jeremy Bentham imagined a "panopticon": a prison where guards could see all prisoners but prisoners couldn't see the guards. The mere possibility of being watched would produce self-discipline.
AI surveillance creates a digital panopticon. Citizens know they might be monitored at any moment. The result is anticipatory self-censorship: people police themselves, conforming to expected behavior without explicit commands.
Democracies are not immune. The same surveillance technologies available to authoritarian states are available to democratic ones. The difference is supposed to be legal constraints and civil-society oversight, but these protections are under pressure.
The post-9/11 expansion of surveillance in democracies shows how quickly civil liberties can erode when security concerns are invoked. AI could accelerate this erosion.