In China, ubiquitous cameras surveil restive minorities. In the US, algorithms determine whether people get locked up. 


Don’t look now, but artificial intelligence is watching you. AI has tremendous power to enhance spying, and both authoritarian governments and democracies are adopting the technology as a tool of political and social control.


The potential of AI surveillance is the subject of the third installment of the Sleepwalkers podcast. The episode examines how AI consolidates power and control, and asks if we can limit this troubling trend.


Data collected from apps and websites already helps optimize ads and social feeds. The same data can also reveal someone’s personal life and political leanings to the authorities. The trend is accelerating thanks to smartphones, smart cameras, and more capable AI.


An algorithm developed at Stanford in 2017 claimed to tell from a photograph whether a person is gay. Accurate or not, such a tool creates a new opportunity for persecution.


“Take this type of technology, feed it to a citywide CCTV surveillance system, and go to a place like Saudi Arabia where being gay is considered a crime,” says Lisa Talia Moretti, a digital sociologist. “Suddenly you’re pulling people off the street and arresting them because you’re gay, because the computer said so.”


No country has embraced facial recognition and AI surveillance as keenly as China. The AI industry there has flourished thanks to fierce competition and unrivaled access to personal data, and the rise of AI is enabling tighter government control of information, speech, and freedoms.


In some Chinese cities, facial recognition is used to catch criminals in surveillance footage, and to publicly shame those who commit minor offenses. Most troubling, AI is being used in Xinjiang, an autonomous region in western China, to persecute Muslims. China is now exporting the technology, along with the principles of techno-repression, to countries including Pakistan, Cambodia, and Laos, through its Belt and Road Initiative.


Even if China’s AI capabilities are exaggerated, the AI boom there is having a chilling effect on personal freedom, says Ian Bremmer, an expert on global political risk and founder of the Eurasia Group. “You just need a government that is starting to get that capacity and make it known, and have a few people that are sort of strung up as examples, and suddenly everyone is scared,” he says.


This might feel like a distant reality, but similar tools are being developed and used in the West. Just ask Glenn Rodriguez, who faced judgment from an algorithm when seeking parole from prison in the US.


Despite 10 years of good behavior, Rodriguez found that an algorithm called COMPAS, designed to predict inmates’ likelihood of reoffending, was biased against him. And even though the parole board went against the computer program’s advice and set him free, it imposed the algorithm’s recommended curfew. “I'm still haunted by COMPAS,” Rodriguez says.


Law enforcement is embracing AI. The episode concludes with a look at the New York Police Department, which is testing technologies including facial recognition. Although AI promises to make the department more effective, and perhaps even more accountable, how we respond to this troubling trend may determine whether the West sleepwalks toward its own form of technological tyranny.


“In America, the liberty we take for granted is hard-won and fragile,” says Oz Woloshyn, the host of Sleepwalkers. “So much hangs in the balance, and the decisions we take will affect our lives profoundly, and echo through the lives of our children.”