
The Tech Coup: A New Book Shows How the Unchecked Power of Companies Is Destabilizing Governance

In The Tech Coup: How to Save Democracy from Silicon Valley, Marietje Schaake, a Stanford HAI Policy Fellow, reveals how tech companies are encroaching on governmental roles, posing a threat to the democratic rule of law. 

Marietje Schaake (photo credit: Rod Searcey)

In The Tech Coup: How to Save Democracy from Silicon Valley, Stanford Institute for Human-Centered AI Policy Fellow Marietje Schaake aims to raise public awareness about a serious threat: the unchecked power of private companies. In addition to exerting vast economic power, tech companies are stepping into roles normally preserved for governments and wreaking havoc on the democratic rule of law in the process. From cybersecurity to systems used for policing, elections, and military defense policy, tech companies large and small are playing an outsized role.

Her book illustrates this thesis with examples drawn from her experiences as a member of the European Parliament from 2009 to 2019 and as an engaged observer of tech culture while a fellow at Stanford’s Cyber Policy Center and Stanford HAI.

Here, Schaake discusses why ceding power to tech companies hurts democracy and outlines a few solutions from her book.

In what ways are private companies increasingly taking on functions normally assumed by states?

In the digital realm, companies’ control of information, unfettered agency, and power to act have almost overtaken those of governments.


For example, in the private intelligence sector, companies like NSO Group Technologies with its Pegasus spyware products are creating and selling the capability to hack into people’s devices. This means that anyone with the financial resources to purchase Pegasus spyware can access the capabilities of intelligence services and hack into the very private information of political opponents, judges, journalists, critical employees, competitors, and others. 

Another striking example is that of offensive cyber capabilities. In the name of defending their clients or their networks, companies are attacking hackers across borders, using “offense as defense.”

And notice that I’m talking not only about big tech companies but also small ones, because de facto power comes from the development of digital technologies. Companies like NSO, or Clearview with its facial recognition software, or companies producing election technologies – these are not so-called Big Tech, but they are just as illustrative of the challenges I’m pointing to as, say, Elon Musk deciding who in Ukraine should and should not have access to Starlink internet connections.

So, we see that these capabilities, these decisions, and these powers that used to be the exclusive domain of the state are now seeping into the hands of private companies, but without the checks and balances that we want in societies that function by the rule of law.

What can be done to put democratic entities back in charge?

First of all, there needs to be more awareness and understanding of the ways in which companies exert power over governance, democracy, and international law. And then we need to bring the same level of legal clarity, accountability mechanisms, and transparency measures to the digital realm that we expect around other innovations such as medicines, chemicals, foods, cars, or even processes such as the decision to engage in foreign conflict. 

Let’s compare how the U.S. responded to the Ukraine war in the physical world versus in the cyber domain. As part of NATO, the U.S. is clear: It doesn’t want to see boots on the ground. But in the cyber domain, the U.S.’s offensive activities are ongoing. That political discrepancy can continue because of the legal gray zone in the digital realm.

So, I’m saying that the legal frameworks that we rely on in other industries – as well as the approaches we take to the physical side of foreign conflict – need to be on par in the digital realm and need to keep pace with where the technology has gone. It’s basically about catching up through international law, regulations, and enforcement, to make sure that the ideas that we have about what a democratic mandate looks like, what accountability looks like, and what oversight looks like are actually meaningful whenever activities happen in the digital sphere.

To some extent, you’re calling for implementing traditional types of laws and regulations in the digital realm, but is there also a way in which you’re calling for something new – a reinvention of democratic governments to handle the challenges of digitization?

Yes, there are a number of novel large- and small-scale changes I would recommend. For example, just as legislatures rely on independent legal teams to help draft legislation that will survive court challenges, they also need independent technology experts they can turn to for reliable information. Making tech expertise available to lawmakers would go a long way toward reducing lobbyists’ effectiveness and ensuring lawmakers understand how technology impacts issues like healthcare, education, justice, housing, and transportation. Because tech touches everything, lawmakers can’t get a handle on it by sitting in a single committee. They need to consult independent experts. 

Another example: Governments are increasingly outsourcing all kinds of processes to tech companies. And if a tech company operates in the name of a government, it should be as accountable as the government. I call this “the public accountability extension.” It sounds simple, but it would be a huge game changer. Right now, as governments outsource more and more critical governmental functions to tech companies, they also offload governmental accountability.

For example, in many jurisdictions, the police are not allowed to hack into the devices of suspects. Instead, they engage a hacking company that will do it for them. The police can then say they don’t hack, but they actually gain access by alternate means. Similarly, state and local governments in the United States aren’t constitutionally allowed to discriminate between citizens on the basis of sensitive categories, but oftentimes the technology they use does exactly that without being held accountable. A third example: Freedom of Information Act (FOIA) requests. Journalists have the right to know what the government does in citizens’ name, but sometimes governments hire private companies to do government work or to collect and store government information, and if the companies don’t keep the information at the standard required of public entities, or if the companies are reluctant to provide information due to proprietary concerns, then the effectiveness of FOIA erodes.

So, there are many examples of how governments essentially hide behind companies, including tech companies, to avoid accountability, and the accountability extension would address that.

In your book, you point out another tension: tech companies’ energy use in the communities they enter. You note that the public needs greater insight to provide greater oversight over the data centers underlying our digital lives. What do you mean?

There are no standards or reporting obligations requiring companies to say how much energy or water they’re using or plan to use. We have estimates from individual cases, but we don’t know the sum of data centers’ energy use.

And often, big tech companies seeking to build data centers in a community will bid for those projects under a pseudonym – a company name that hides the fact that Amazon, Google, or Microsoft is behind the bid. The lawyers and consultants that tech companies hire to make these bids often paint a rosy picture of all the economic benefits that will accrue to communities that allow data centers to be built. They do their best to hide critical information about who they are and the data centers’ energy needs. And this complete lack of transparency regarding the use of scarce resources hinders transparent and good governance.

We had some cases in the Netherlands where this was a big problem. Part-time city councilors had to make decisions about whether or not to allow a hyperscale data center in their territories. And they were up against billion-dollar companies and all their lawyers, accountants, consultants, and PR firms. The power asymmetry was enormous, and I think standardizing the transparency and reporting requirements – including who is behind the projects and the metrics of energy use – would make this a much fairer public debate about the types of data centers a community can feasibly host.

This increased transparency doesn’t begin to address whether society even wants more of these data centers using up scarce energy resources in an era of climate change. But we can’t answer that cost-benefit question if we don’t know what the cost is and all we’re told is benefit, benefit, benefit in some shady presentation.

And of course, the other real issue, especially in the U.S., is the capacity of the power grids. In the Netherlands, an advanced economy, as in the U.S. and the U.K., we’ve already seen reports that the grids are functioning at near-emergency levels: code red. The grids are stretched to their limits. They break down, and outages are more frequent. And yet many data centers that were agreed to years ago are still in the pipeline. When they come online in two or three years, we may face a wave of disasters.

Can you talk about the precautionary principle that’s already part of EU law and how it might help rein in tech company power, especially AI?

Basically, the precautionary principle requires a pause to assess the societal impact of an innovation when it is new and before it is widely deployed or implemented. It’s enshrined in EU law but hasn’t been invoked by the authorities for tech innovations like AI because there’s such a push to use AI and also because the EU has now passed a separate law that deals with AI.

But I think the precautionary principle is a useful concept to address situations where tech company engineers are themselves surprised by the behavior of AI models, or where society and lawmakers and citizens don’t know or understand how a particular innovation will impact their lives. So we’d like to create processes that actually serve the public by using the precautionary principle to assess and research before unintended but still preventable problems spread too widely.

Have Americans been slow to understand the threat that tech companies pose to democracy?

Around the globe, there have been signs of the harms of tech companies’ power grab for a long time now, but Americans have tended to view them as distant events. People in Myanmar used Facebook to call for genocide in 2017. In 2018, we learned that Cambridge Analytica extracted and analyzed millions of Facebook users’ data and allowed political campaigns to use it for psychological profiling in ways that may have impacted the Brexit vote in the U.K. And then in January 2021, there was the storming of the U.S. Capitol, which was fed in part by disinformation on social media and has contributed to tens of millions of Americans losing trust in the electoral process. So, some of the problems that people were facing elsewhere have come back to Americans, quite late, as a boomerang.

You invite all of us as democratic citizens to help shape an agenda that puts the survival of democratic principles ahead of short-term economic benefits. What would that look like for an ordinary citizen?

There are a lot of choices that consumers can make about how they want to use technology and services, but I don’t think that’s enough because there’s a huge asymmetry between the power of the individual internet or device user and the tech companies.

If people understand that the companies that offer all these entertaining, efficient, helpful digital tools are also engaged in power dynamics that affect capital and national and global politics, then they will also understand the need for independent oversight and countervailing powers, just as we have in almost every other aspect of our society.

And so I hope that people will bring their concerns to the political agenda and ask more from their leaders, especially in the U.S. I think a lot of people in the United States would like to see a federal data protection law; protections for kids online; better cybersecurity; and an effective means of addressing disinformation about healthcare and elections, but there is no political majority in Congress to make these things happen. And the sad result is that people get defeatist and think it’s easier to just not raise these issues.

But it’s important for people to keep raising their concerns. And doing so at the state level is an interesting option because many states are now taking steps to pass laws and regulations around technology, given that the federal government can’t or won’t.

The Tech Coup: How to Save Democracy from Silicon Valley was published in September by Princeton University Press.
