The Rise — And Rise — Of Mass Surveillance

Eavesdropping bureaucrats have been replaced by algorithm-driven facial recognition technology. But the real impact of indiscriminate surveillance may be in our minds.

We live in a world where school cameras monitor children’s emotions, countries collect people’s DNA en masse, and no digital communication seems truly private.

In response, we use encrypted chat apps on our phones, wear masks during protests to combat facial recognition technology, and try vainly to hide our most personal information from advertisers.

Welcome to the new reality of mass surveillance. How did we get here?

Wael Eskandar, an Egyptian journalist and technologist, remembers documenting his country’s revolution at Cairo’s Tahrir Square in 2011. It was common knowledge at the time, he says, that people’s phone calls were being monitored, and that workers like parking lot attendants and security guards were feeding information back to the police. But few suspected emails or posts on Twitter and Facebook would ever be monitored in the same way — at least not at scale.

The revolution toppled the brutal regime of longtime dictator Hosni Mubarak, but by 2014 the country was under the sway of the equally repressive President Abdel Fattah el-Sisi. Now, Egyptians are being arrested for political posts they made on Facebook, and some have reported having their texts read back to them by police during detention. Demonstrations have all but stopped.

In 2019, rare protests did take place in Egypt over government corruption. Demonstrators avoided posting about them on social media, wary of ending up in detention, but ultimately it didn’t matter — dozens of people were rounded up anyway.

“It’s like there’s no space left for us to speak anymore,” one woman who had participated in the demonstrations told me earlier this year.

Egypt and dozens of other authoritarian states have increasingly employed mass surveillance technology over the past decade. Where human monitors once had to listen in to phone calls, now increasingly sophisticated voice recognition software can do that at scale, and algorithms scour social media messages for signs of dissent. Biometric surveillance systems like facial and behavioral recognition also make it easier for security services to target large swathes of their population.

But mass surveillance is not just the domain of repressive regimes. Companies are using their own forms of surveillance — data collection to target consumers with ads, and biometric screenings to watch their moods and behaviors. In 2012, the New York Times reported Target had figured out a teenager’s pregnancy before her father did; now it’s using Bluetooth to track your movements as you wander its store aisles. Five years ago, the US Federal Trade Commission called on Congress to regulate data brokers, saying consumers had a right to know what information those brokers held on them. In 2019, these companies remain largely unregulated and hold reams of information about individuals, almost none of which is known to the public.

Powering these surveillance systems is an increasingly complex web of personal data. In 2009, that data might have included your neighborhood and purchasing history. Now it’s likely that your most personal qualities — from your facial features to your search results — will be slurped up too. Cross-referencing seemingly inconsequential data from different sources helps companies build detailed and powerful profiles of individuals.

Surveillance systems are being built by some of the world’s biggest technology companies, including US tech giants Amazon, Palantir, and Microsoft. In China, companies like SenseTime, Alibaba, and Hikvision — the world’s largest maker of surveillance cameras — are moving quickly to corner foreign markets from the Middle East to Latin America. And other players like Israel’s NSO Group are making it easy for governments all over the world to break into the devices of journalists and dissidents.

This all-seeing surveillance seems straight out of the dystopian fiction of George Orwell’s 1984 or Aldous Huxley’s Brave New World. But more than a century and a half earlier, novelists had imagined surveillance as a cornerstone of utopian societies. As far back as 1771, the French novelist Louis Sébastien Mercier depicted a futuristic society exemplifying the rational values of the Enlightenment in a hit novel called L’an 2440. This imagined social order was enforced by a cadre of secret police.

For most of modern history, mass surveillance, when it has been implemented, has been laborious and expensive. The Stasi, infamous for spying on the most mundane aspects of East Germans’ lives, relied on massive networks of informers and on bureaucrats picking through letters and listening in on phone calls. A friend who grew up in Dresden before the fall of the Berlin Wall once told me she remembered being asked by her kindergarten teacher whether her parents were watching West German TV.

Without that level of human participation, these systems simply did not work. They functioned well enough for governments that wanted to monitor individual troublemakers, but quashing dissent altogether was far harder.

In less developed parts of the world, such as Nicaragua and North Korea, state surveillance still works this way. But in richer countries — ranging from democratic societies like the US and the UK to authoritarian ones like China — the burden of conducting surveillance has shifted from humans to algorithms.

It’s made surveillance in these places far more efficient for both governments and companies, and as the technology improves and becomes more widespread, it’s only a matter of time before the rest of the world adopts similar techniques.

In 2012, I wrote an op-ed with the author and journalist Peter Maass arguing that we should think of cellphones as “trackers” instead of devices to make calls with. That idea now seems quaint — of course cellphones and the apps we download to them are monitoring our activities. We published the article not knowing that less than a year later, a 29-year-old former NSA contractor named Edward Snowden would leak an unprecedented cache of documents showing some of the true scope of the mass surveillance programs in the US.

Snowden’s leaked documents revealed, among many other things, that the NSA was collecting phone records from millions of Verizon customers, and that it had accessed data from Google and Facebook through back doors. In Germany, the intelligence service was also listening in on millions of phone calls and reading emails and text messages in a surveillance program often compared to that of the Stasi.

By the time Snowden vaulted to fame, hiding out in a hotel in Hong Kong, I had left the US too. I arrived in Beijing to begin work as a journalist for Reuters in late 2012, and fully expected to be the object of some government snooping. After all, there are only a few hundred foreign journalists based in China — a country of more than a billion people — and the things they write are closely scrutinized because of their ability to shape the world’s view of China.

At the time, a constant subject of debate among junior reporters over kebabs and beer was whether the government was really keeping an eye on our communications, or if we were too small potatoes to matter. I often joked with an old boyfriend, an American who worked in foreign policy, that somewhere an unlucky state security intern was monitoring our cutesy volley of GIFs and emojis. We imagined our eavesdroppers as disheveled bureaucrats, not as lines of code.

One year, a Chinese police official pointedly commented that my apartment looked cheap and untidy — it was a way to let me know he’d seen the inside of it. On other occasions, police arrived at my door supposedly to check if my water heater was up to standard, but spent more time eyeing the contents of my bookshelf and asking about my work. My colleagues, like the Financial Times’ Yuan Yang, have had private messages on WeChat — the ubiquitous Chinese social app made by tech giant Tencent — quoted back to them by government officials.

Yang described one such encounter on Twitter:

“At my annual China visa renewal:
Police officer: I saw you posted on social media about organising an event for journalists on the 8th
Me: I don't think I did...
Me: *thinks, does he realise he saw that by surveilling my private messages and not on my public feed*”

But by and large, none of us ever found out definitively whether our flats were bugged, our emails read, our phones monitored. We just acted as if they were.

Snowden was all over the state-run news in China — the story of an American dissident outing the US surveillance system was far too juicy to pass up. To this day, Chinese officials sometimes bring up Snowden and what he revealed about America’s surveillance program in response to questions about the Chinese nanny state.

At that time, surveillance seemed like an invisible web — something everyone knew was a problem, but was tough to actually see.

What I never predicted was the expansion of surveillance technology into a form so visible and widespread that it became as much a part of the atmosphere of China as Beijing’s infamous smog. Facial recognition cameras, for instance, are now ubiquitous in the country after first appearing in the western region of Xinjiang, where more than a million Uighurs, Kazakhs, and other Muslim ethnic minorities are now in internment camps. The region has become the global epicenter of high-tech surveillance, which the Chinese government has combined with heavy-handed human policing: officers ask people dozens of highly personal questions at their homes, at police stations, and in roadside interrogations, then feed the answers into a centralized system called the Integrated Joint Operations Platform, which spits out determinations of whether Muslim citizens should be interned.

It is the first example of a government using 21st-century surveillance technology to target people based on race and religion in order to send them to internment camps, where they face torture and other horrific abuses. According to some estimates, it is the largest internment of ethnic minorities since World War II.

The collection of such data for security purposes is often called “predictive policing,” a technique used in many countries, including the US, to spot signs of potential criminal behavior in data.

When I visited Kashgar, a city in southern Xinjiang, in the fall of 2017, it felt like catching an uncanny glimpse of a suffocating future — one where DNA collection was mandatory and even filling your car with gas required a scan of your iris.

Since then, much of the technology being used in Xinjiang has been sold to other parts of the world. Companies and the governments that contract with them point to the many benign uses of some surveillance tech — security, public health, and more. But there are few places in the world where people have been asked to consent to surveillance tech being used on them. In the US, facial recognition technology is already widely used, and only a handful of cities have moved to ban it — and then, only its use by government authorities. Campaigners against mass surveillance systems say it’s tough to convince people these technologies are genuinely harmful — especially in places where public security or terrorism are serious problems. After all, digital monitoring is usually invisible and security cameras seem harmless.

“I don’t think people are happy about tech or positive about tech for the sake of it, but they don’t know the extent to which that can go wrong,” said Leandro Ucciferri, a lawyer specializing in technology and human rights at the Association for Civil Rights in Argentina. “People don’t usually have the whole picture.”

When, in the course of reporting, I peered at the back ends of surveillance systems that claimed to track individuals by their clothing, their faces, their walks, and their behavior, I wondered how I could continue to do my work in the same way. Could I go out to meet a source for coffee without immediately outing her in front of a camera whose video streams were being parsed by an algorithm?

“The tech developments themselves have enabled the Chinese government to implement its vision,” said Maya Wang, senior China researcher at Human Rights Watch and one of the leading authorities on mass surveillance in Xinjiang. “That's why we see the rise of the total surveillance state — because it's now possible to automate much of the surveillance and be able to spot irregularities in streams of data about human life like never before.”

What happens to the myriad facets of our private lives — going to a therapy appointment, buying birth control, meeting a date — when it’s so easy to monitor us?

What happens when it’s our faces, not our phones, that are our trackers?

Eritrea, a small nation in the Horn of Africa, is one place where the government’s approach to monitoring people remains decidedly 20th century. Only 2% of the population, largely the urban elite, has access to the internet. There’s little evidence the government is investing in sophisticated surveillance systems of the kind China uses.

My friend Vanessa Tsehaye, an Eritrean-Swedish journalist and activist, believes deeply in grassroots campaigns for human rights in the country. A recent college grad, she spent her teenage years campaigning for the Eritrean government to free her uncle, the journalist Seyoum Tsehaye, from prison.

Tsehaye is the most relentlessly positive campaigner I know — but even she feels bleak thinking about the rise of the surveillance systems of the future.

“Their main methods of censorship are limiting access to the internet,” Tsehaye said. “Eritrea is the most censored country in the world, and despite that, people are slowly but surely mobilizing.”

“But if you add sophisticated surveillance tech,” she said, “the government could do whatever they wanted. It would destroy everything.”

Early this year, I met a Nicaraguan scholar at a conference and asked him about protests critical of President Daniel Ortega that had gripped the country. I was curious whether protesters there were concerned about facial recognition.

He told me to search “Nicaragua protests” on Google images. Sure enough, every photo showed demonstrators covering their faces with handkerchiefs and sunglasses.

There are many reasons besides facial recognition that protesters might like to cover their faces — tear gas being one of them — but regardless, masks have begun to show up in demonstrations all over the world. In Hong Kong this year, the government has even banned their use. It’s one way that people are coping with surveillance in the modern world.

Most demonstrators I’ve met in my time as a reporter are not activists who are willing to risk imprisonment for the causes they fight for. Rather, they are ordinary people with jobs, families, and responsibilities. I have wondered how the protest movements of the future would be possible in the presence of newly sophisticated surveillance tech. Would anyone be willing to complain about their leaders online, swap political texts with a friend, or go out and join a street protest if they knew they’d be immediately outed by an algorithm?

“I worry tremendously over whether human beings will have freedom in the future anymore,” said Human Rights Watch’s Wang. “We used to worry about the age of AI as robots annihilating humans like in science fiction. I think what’s happening instead is that humans are being turned into robots, with the sensory systems placed around cities that are enabling governments and corporations to monitor us continuously and shape our behavior.”

In some parts of the world, anti-surveillance campaigns have picked up steam as the technology has become more ubiquitous. Facial recognition bans are being discussed by politicians across the US, for instance, and in 2016 the EU passed the General Data Protection Regulation, a sweeping set of rules aimed at protecting personal data.

Citizens of authoritarian states, however, have fewer options. What many pro-privacy groups fear is a bifurcated world where citizens of democratic systems have privacy rights that far outpace those of people who live in authoritarian countries.

Eskandar, the Egyptian technologist, believes there is still room for optimism.

“Nonconformity was the fuel of the revolution,” he told me by phone. “I’ve seen it happen. A few people with very few resources have outmaneuvered a state apparatus — it’s happened time and time again. I really believe that people who are proponents of freedom rather than fascism can think freely. So there is hope.”●




