The Authoritarian Risks of AI Surveillance
Credibility Rating
4/5 — High (4). High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: Lawfare
Published on Lawfare, a national security and law-focused outlet, this piece is relevant to AI governance discussions about dual-use risks and the geopolitical dimensions of AI deployment, particularly for researchers studying how AI could enable large-scale societal control.
Metadata
Importance: 55/100 · opinion piece · analysis
Summary
This Lawfare article examines how AI-powered surveillance technologies can be exploited by authoritarian regimes to monitor, control, and suppress populations. It explores the political and governance risks posed by the proliferation of AI surveillance tools, both domestically and through export to repressive governments.
Key Points
- AI surveillance tools (facial recognition, predictive policing, social scoring) dramatically amplify state capacity for population control and repression.
- Authoritarian governments can use AI surveillance to target dissidents, minorities, and political opponents with unprecedented precision and scale.
- Export of AI surveillance technology by democratic nations to authoritarian regimes raises serious human rights and geopolitical concerns.
- Weak international governance frameworks currently allow widespread proliferation of surveillance AI with limited accountability.
- Domestic misuse of surveillance AI in democracies risks gradual erosion of civil liberties and normalization of authoritarian practices.
Cited by 2 pages
| Page | Type | Quality |
|---|---|---|
| AI-Enabled Authoritarian Takeover | Risk | 61.0 |
| AI Authoritarian Tools | Risk | 91.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 25 KB
## [Matthew Tokson](https://www.lawfaremedia.org/contributors/mtokson)
Concerns about authoritarianism [loom](https://www.nytimes.com/2025/03/20/world/europe/trump-courts-defiance-autocrats-playbook.html) [large](https://www.lawfaremedia.org/article/the-situation--the-five-pillars-of-trumpian-repression) in American politics. Against this backdrop, another phenomenon may be pushing democracies toward authoritarianism: artificial intelligence (AI) law enforcement. AI surveillance and policing systems are currently used by authoritarian nations around the world. Evidence suggests that [these systems are effective](https://economics.mit.edu/sites/default/files/inline-files/aitocracy_QJE.pdf) in suppressing political unrest and entrenching existing regimes. Concerningly, AI surveillance and policing systems have also become increasingly prevalent in cities across the United States.
As I explain [in a new article](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5182213), AI law enforcement tends to undermine democratic government, promote authoritarian drift, and entrench existing authoritarian regimes. AI-based systems can reduce structural checks on executive authority and concentrate power among fewer and fewer people. In the wrong hands, they can help authorities detect subversive behavior and discourage or punish dissent, while enabling corruption, selective enforcement, and other abuses. These effects are already visible in today’s relatively primitive AI systems, and they’ll become increasingly dangerous to democracy as AI technology improves.
**AI Law Enforcement from China to the U.S.**
To get a sense of the capabilities of AI law enforcement, look to present-day China. [Analysts estimate](https://www.nytimes.com/2022/06/21/world/asia/china-surveillance-investigation.html) that over half of the world’s surveillance cameras are in China, and many of those cameras use AI facial recognition. AI algorithms identify people and track their movements, allowing the government to monitor their activities and their meetings with others. Iris scans act as a visual fingerprint of people, even those wearing masks. Spy drones fly above China’s cities, recording activities in ever-sharper detail. AI analytics can spot unlawful or anomalous actions, even littering. In recent years, Chinese authorities have installed facial recognition cameras inside residential buildings, hotels, and even karaoke bars. The [goal of installing these systems](https://www.nytimes.com/2022/06/21/world/asia/china-surveillance-invest
... (truncated, 25 KB total)
Resource ID: ae842d471373d0fb | Stable ID: MzAyODdkMD