Featured Book
The Age of Surveillance Capitalism:
The Fight for a Human Future at the New Frontier of Power
PublicAffairs, 2019
By Shoshana Zuboff
Report by John Zada
“Surveillance Capitalism” is a new paradigm of free enterprise whereby big tech corporations seek to convert all human experience into data, and create wealth by predicting, influencing and controlling human behaviour at scale.
It has been a little over two decades since an obscure digital systems architecture known as the Internet insinuated itself, almost innocuously, into our lives. In that time, scarcely a day, or even an hour, goes by when most of us aren’t engaged in some form of online activity. The technology has brought us undreamed-of conveniences and benefits, while also foisting upon us a whole host of Faustian costs: from screen addiction to the erosion of our social relations, to the incessant tower-of-babble bickering that drives an ever more contentious political polarization. As artificial intelligence (AI) reaches new pinnacles of machine learning, spawning unprecedented consumer tools that will change our lives for better and worse, the ledger of digital technology’s pros and cons continues to grow.
Shoshana Zuboff, Professor Emerita at Harvard Business School, argues in her most recent book, The Age of Surveillance Capitalism, that a new form of exploitation, the likes of which humans have never before seen, has attached itself to big tech’s ubiquitous online services. The term she coins for it, “surveillance capitalism,” describes a novel paradigm of free enterprise that seeks to convert all human experience into data and create wealth by predicting, influencing and controlling human behaviour at scale.
Zuboff tells us that most players in the big tech industry are involved in surveillance capitalist activities, which are “run by a narrow priesthood of privately employed computational specialists” and which she describes as “an original sin of simple robbery.” Google, Facebook, Microsoft and scores of smaller companies, many of which create the apps we download onto our phones, have learned how to mine and monetize the information available in, and inferable from, our profiles and online behaviour.
More than a few of us are now aware that big tech is taking and selling some of our personal information to third parties. Leaks over the years about Facebook’s collection of our private data—about what we “share” and “like” on its platform—have helped inform us in that regard. But Zuboff tells us that these more obvious pursuits are not only the tip of the surveillance capitalism iceberg; they are also the least important and least profitable of its activities.
It isn’t so much that surveillance capitalism takes the explicit information you knowingly provide online: your location, your birthday, the posts you endorse with a “like.” That information, Zuboff tells us, is secondary. The data about you that is most prized is what these companies have learned to infer from, or find implied by, your online behaviour.
Google realized early on that the real money was in mining what it called “digital exhaust” or what Zuboff terms “behavioural surplus”—the residual data that exists between the lines of your online life. It may be the fact that you like mountains—because the Rockies jut out in the background of a photo you post. Or that you’re a night owl because you post frequently after midnight. Facebook likewise discovered that understanding users’ personalities was the key to accurately predicting what they want. And understanding personality, in turn, came less from knowing what content people shared than from how they shared it.
Surveillance capitalism’s personality profiling for the auto insurance industry provides a perfect example. “We are not scrutinized for substance but for form,” Zuboff writes. Where our social media use is concerned, judging personality “does not derive from what you write but how you write it. It is not what is in your sentences but in their length and complexity, and not what you list but that you list, not the picture [you post] but the choice of filter and degree of saturation, not what you disclose but how you share or fail to, not where you make plans to see your friends but how you do so: a casual ‘later’ or a precise time and place? Exclamation marks and adverb choices operate as revelatory and potentially damaging signals of your self.”
Zuboff reveals that the photographic street mapping project known as Google Street View (a function of Google Maps), which allows users to access street images at any address, became a covert effort to mine yet more of this “digital exhaust.” The roving cars that Google Street View used to photograph so many of the world’s towns and cities were also capturing the behavioural surplus information that appeared at or near each address (the cars in the driveway, the kinds of trees or flowers in the front yard). The Google vehicles also clandestinely helped themselves to digital data from the unsecured routers in the homes they passed. This brings to mind a recent article by historian Jill Lepore in the New Yorker, “The Data Delusion,” in which she writes, “Commerce in the 21st century is espionage for profit.”
Surveillance capitalist companies sell this data to other parties, who do what they want with it. But more importantly, they use it themselves to offer us suggestions online about what they think we want to spend our money on: places to go, items to buy, content to consume—thereby steering our behaviour towards predictable, guaranteed commercial outcomes that they can monetize again later in other ways. These are suggestions we might not otherwise have conceived, making us pawns in a larger game.
For example, users of Pokémon GO—an “augmented reality,” treasure-hunt-style video game for smartphones in which players walk around a city trying to find digital loot for points—have no idea they’re being steered towards businesses that pay the game’s maker, the Google-incubated Niantic, to herd people to their “sponsored locations.” The game is a Trojan horse, a ruse, by which the corporate partners behind it monetize people’s behaviour.
These “economies of action” or “behaviour modification” operations, as Zuboff calls them, are core to big tech’s “prediction imperative.” Their ultimate goal, she tells us over and over again in the book, is to steer all of us to where they think we should go. And they are largely succeeding.
“Surveillance capitalists’ interests have shifted from using automated machine processes to know about your behaviour to using machine processes to shape your behaviour according to their interests,” Zuboff writes. “In other words, this decade-and-a-half trajectory has taken us from automating information flows about you to automating you. Given the conditions of increasing ubiquity, it has become difficult if not impossible to escape this audacious, implacable web.”
Surveillance Capitalism: Beyond the Internet and Social Media
This prediction imperative—to intervene in our state of play towards the creation of certainty and monetizable guaranteed outcomes—has set its sights beyond the internet and social media. Surveillance capitalists, Zuboff alleges, now seek to access our behavioural trends and data through any and all possible means in the physical world. They seek to make all objects and environments collection and tracking nodes of our behaviour—new milieus for predicting and shaping our future behaviour.
The idea behind such grandiose and utopian projects as “smart homes,” “smart cities” and “the internet of things” is to provide yet more of this data. Making things “smart” by adding sensors to them, from your vacuum cleaner, mattress, thermostat, alarm system, toothbrush, coffee mug and oven to your baby’s nursery and even contact lenses, is meant to confer benefits on us. A “smart mirror” in your bathroom will let you surf online as you groom yourself, but may also record your facial biometrics to gauge your emotional state and serve you predictive advertisements in the moment. In spite of their perks, what these “smart” technologies are really doing is rendering yet more digital exhaust from our most intimate spaces, with our mostly forced and uninformed consent.
For Zuboff this amounts to an absurd overreach in pursuit of “digital omniscience,” in which the physical world, and all the lives and things that exist within it, is transformed into one massive data collection field.
“In the flatness of this flow, data are data, and behaviour is behaviour,” she writes. “The body is simply a set of coordinates in time and space where sensation and action are rendered into data. All things animate and inanimate share the same existential status in this blended confection, each reborn as an objective and measurable, indexable, browsable, searchable ‘it’.”
Why Understanding Surveillance Capitalism Matters
Surveillance capitalism is deeply problematic, Zuboff argues, for numerous reasons.
For one, it is both unethical and undemocratic. It is practiced without society’s awareness or consent, and without any accountability for what is done with the information it takes. Because this economic paradigm has no known precedent, it is invisible and unrecognizable. And it lulls us further into complacency with the free or low-cost services it offers. (Zuboff coins the technical terms she does in the book precisely to bring these actions into awareness and thus greater visibility and comprehensibility.)
Big tech companies unilaterally assert their rights to this information and make us sign online “agreements” in return for their services: take-it-or-leave-it propositions that we cannot amend, in the way contracts have always and everywhere been open to negotiation. If this isn’t concern enough, surveillance capitalism’s methods can be hijacked, as in the 2018 Cambridge Analytica data scandal. That debacle revealed that Cambridge Analytica, a British consulting firm, had harvested the Facebook data of 87 million users over several years and sold it to political strategists in the US, who used it in targeted online ads during the 2016 American presidential election. Zuboff writes that the company “merely reoriented the surveillance capitalist machinery from commercial markets in behavioural futures towards guaranteed outcomes in the political sphere.”
Ultimately, for Zuboff, surveillance capitalism erodes our autonomy, dignity, agency, and free will as human beings to author our futures without coercion.
Zuboff writes: “If industrial civilization flourished at the expense of nature and now threatens to cost us the Earth, an information civilization shaped by surveillance capitalism will thrive at the expense of human nature and threatens to cost us our humanity.”
What Can We Do to Bring Surveillance Capitalism to Heel?
Zuboff calls on us to “reject the Faustian pact of participation for dispossession that requires our submission to the means of behavioural modification.” Awakening to a sense of indignation and outrage is a necessary step, she says, to regaining agency.
Because a healthy, functioning democracy threatens surveillance revenues, and because robust human awareness, higher consciousness—the conscious evolution advocated by writers such as Robert Ornstein and Idries Shah—endangers the project of behaviour modification, Zuboff suggests that each of us become the “friction” in the machine: a sort of resistance in the digital works, which frustrates big tech’s dreams of complete omniscience.
At an individual level, that might include reducing or relinquishing our digital dependence by deleting our apps, or turning off their tracking and alerts. Disposing of our smartphones altogether is an even more effective gambit.
More important are the collective moves at the societal level. Countries need to enact laws that reject the fundamental legitimacy of surveillance capitalism’s declarations upon our information, and which interrupt its basic operations.
Doing so, Zuboff says, “would signify a withdrawal of social agreement to surveillance capitalism’s aims and methods in the same way that we once withdrew agreement to the antisocial and antidemocratic practices of raw industrial capitalism,” a withdrawal that righted the balance of power between employers and workers in the 20th century.
However, left to its own devices, surveillance capitalism will continue to “tune, herd, and condition” our behaviour with ever more subtle and subliminal cues that steer us towards its most desired outcomes, which we wrongly believe are our own.
Human agency, “our right to the future tense,” she says, hangs in the balance.