Le Show; 2019-06-02
- Transcript
From deep inside your audio device of choice. Ladies and gentlemen, principles are good. Politics are better. That's the message we got this week, when Senate Majority Leader, Republican Mitch McConnell, who had stood athwart and blocked President Obama's nomination to the Supreme Court because it was an election year, a presidential election year, announced this week that he won't do that if there's a nomination to the Supreme Court next year, because it's not Obama. Meanwhile, you may remember, if you're old enough, around the time of the Iraq War the House of Representatives cafeteria changed the name of French fries to freedom fries. This week, the Energy Department announced approving a liquefied natural gas project in Texas, saying it would allow, quote, molecules of U.S. freedom to be exported to the world. The Department said the permit is critical to, quote, spreading freedom gas throughout the world.
Unquote, hello, welcome to the show. Old freedom's just another word for clean energy. Energy's just another word for jobs. Used to call gas natural, but that ticked off the greens. Only God can guess what twirls their knobs. Freedom gas, molecules of freedom,
liberty that comes out through a hose. Best in class, just being you and me, then we're gonna go and kick the whole world's ass. When we send them shitloads of freedom gas, shitloads, son. Oh, freedom gas is what you get when you frack up a storm, making dirty energy go clean.
Because it comes out of a nozzle, doesn't mean the fuel is fossil. Don't the people have the right to say what is green? What is green? Freedom gas, let's be moldy-puller, some fuel to free our friends and foil our fools. Let's raise a glass, there's so much more than wind and solar, we can make it smell as sweet as new moon crusts. It's smooth sailing, with less inhaling. Burning gigatons of freedom gas, freedom gas. This is Le Show, and this will be the second in a series of two programs, exploring, to put it mildly, the darker side of the internet-permeated universe
we've been living in for the past couple of decades or so. And my guest today is the author of a relatively new book on the subject, The Age of Surveillance Capitalism. She's Dr. Shoshana Zuboff, and she is currently with Harvard Business School. She's done a term at Harvard Law School as well. She's got several books, including In the Age of the Smart Machine: The Future of Work and Power, and The Support Economy. As I say, her latest book is The Age of Surveillance Capitalism. Dr. Zuboff, welcome. Thank you so much, Harry. It's a pleasure to be with you. It's a very dense book, conceptually dense, and we'll deal with a few of those concepts. I'll just throw out two or three that I think are basic to understanding where you're coming from: behavioral surplus, and the prediction imperative, and the drive of surveillance capitalism to achieve certainty
by, among other means, behavior modification. There's a spray of concepts. Try to knit them all together in the first little bit here, if you would, please. Okay, well, let's start with a definition of surveillance capitalism, because that will give us a framework to do the knitting that we need to do. So, surveillance capitalism, I argue, in many ways diverges from the history of market capitalism, diverges in some ways that are not well recognized and pretty crazy, but there is a significant way in which surveillance capitalism emulates the age-old pattern of how capitalism has evolved. So, let's talk about that for a moment. Capitalism typically evolves by taking things that live outside the marketplace, non-market things or activities,
bringing them into the market, turning them into what we call commodities, things that can be sold and purchased. So, famously, industrial capitalism claimed nature for the market dynamic. And that means bringing it into the marketplace, turning it into raw materials that can be sold and purchased. Real estate, land. Surveillance capitalism follows this pattern, but with a dark and unexpected twist. What it does is to unilaterally claim private human experience for the market dynamic. It claims private human experience as a free source of raw material to be translated into behavioral data. Those data then are combined with advanced computational capabilities, machine learning, and they spit out products, as all machines do.
But in this case, their products are analyses of behavioral data specifically geared to predict our future behavior: what we will do now, soon, and later. And these prediction products are then sold into a new kind of marketplace. This is a marketplace that trades exclusively in human futures, trades exclusively in predictions of our future behavior. So, maybe for some of our listeners what I've just described will sound a little bit science-fiction-y. The fact is, what I've just described perfectly corresponds to what we have come to think of as online targeted advertising. And in fact, as many people know, online targeted advertising was invented at Google, and so was surveillance capitalism.
And it was back in 2000-2001, in the teeth of financial emergency as a result of the dot-com bust, that Google's founders decided that if they were going to save their company they had to get serious about advertising. They were under so much pressure; their very swanky venture capitalist investors were threatening to pull out; everything that they had worked for was on the line. And before this, they had really had a pretty negative view of advertising. But if they were going to start to make some money and not be imploded by this financial emergency, it was going to be advertising. What they discovered was that, of course, they were getting behavioral data as people searched and browsed online. Mostly they were using those data to improve the search engine and to create products like translation.
But there was extra data that was produced that they didn't need for service improvement. Those data, back at that time, were called digital exhaust. Great name: waste material. And who's going to criticize anybody for finding a way to recycle waste material? It's ecological. Exactly. So there had been a few folks in the company who had been experimenting with these leftover data. They were haphazardly stored in data logs and servers. But the folks who had been playing around with this had come to understand that these leftover data had tremendous predictive value. So now the founders decided, we're going to turn to these leftover data. And instead of calling them leftover data, Harry, I call them behavioral surplus. It was a surplus because it was more than they needed
to do their business well. So they turned now to this behavioral surplus that was sitting around and began to examine it for its predictive qualities. And what happened was, they combined this behavioral surplus with their computational capabilities. Already, even back in the early 2000s, Google was way ahead of the pack for its computational sophistication. Even back then, Larry Page, one of the founders, referred to their machines as "our AI," artificial intelligence. So now you've got this behavioral surplus, which is full of the kind of things that some people now would call metadata. Not so much the key word that you search for, perhaps. But what time of day, and how long did you spend? And how many words did you type in
in order to finally get to the thing that satisfied your question, or what you were looking for. Later, in the case of Facebook, we see behavioral surplus coming not just from what we disclose to Facebook. And it's worth saying this, because a lot of folks are confused about this. When Facebook says, we'll let you see the data that we have on you, what they mean there is, we'll let you see the data that you gave us. In other words, the information that you typed in at some point. Right. The information you're aware you gave to us? Exactly. But where the behavioral surplus comes in, Harry, is it's not just what you post. It's how you post. If you're making dinner plans with your friends for the evening, do you say I'll see you at 7:45, or I'll see you later?
Do you have run-on sentences, or do you use bullet points? If you're posting pictures, what's the level of saturation of those photos? What kinds of colors? Right. Do you use exclamation points? Well, then you could be president of the United States. These kinds of hidden behavioral signals actually can be used to predict your personality, what you're feeling, your sexual orientation, your political orientation. There are rooms full of research at this point that take these tiny behavioral signals, which I call behavioral surplus, and now can use them to predict almost anything about you. Things that you have no idea you are disclosing. These are used to construct personality profiles. And our normal understanding is that would be what they're putting together, right? Personality profiles, emotional cycles,
we'll get into a little bit more of this when we start talking about another part of your question, behavioral modification. But there's almost nothing that researchers haven't tried to use these data to predict about you. So what Google discovered was that by using this behavioral surplus, doing the computations on these kinds of data, they could come up with predictions of a very specific future behavior. At that point, back in this originating context, that was a click-through rate. So you might just think of it as a click-through rate, you know, no big deal. But that click-through rate is a prediction of a fragment of human behavior. That click-through rate was sold to a marketplace of business customers who want to know what we're going to do. Now, those were advertisers, and they wanted to know where we're going to click.
But the point is, those were the first markets established exclusively to sell predictions of our futures. This was institutionalized at Google. It made a ton of money, because now they sold advertisers these predictions. Instead of letting advertisers pick keywords and where you're going to place your ads: you bet on these predictions and you're going to make a lot of money. We're not going to let you into the black box that produces these predictions. You buy them, you'll make a lot of money. And that's exactly what happened. So between 2000, when this thing first got going, and 2004, when Google went public and, in its IPO documents, its initial public offering documents, the world first got a glimpse of what had been going on at Google, it became clear that just in those few years, Harry, Google's revenue had increased by 3,590%. On the strength of this logic that I've just described to you,
what of course that means is that these guys set the bar for every investor in Silicon Valley. Who wants to invest in a company that's going to create value the old-fashioned way, when you can invest in a company that's going to make money by stealing our private experience, with methods and mechanisms that, right from the beginning, they understood had to be designed to keep us in ignorance? Right from the earliest patents, you see the data scientists from Google describing their ability now to hunt and capture behavioral surplus from all over the online world, all over the internet, and to find out things about us that we never intended to disclose. And they celebrate in those patents that they know how to do this in ways that are undetectable, untraceable, that bypass our awareness.
You know, I call this the social relations of the one-way mirror, Harry. And of course this is what puts the surveillance in surveillance capitalism, because without the one-way mirror none of this would work. Because what we know from all the research done from the early 2000s onward is that when folks actually find out what's going on behind stage here, they are appalled and they don't want any part of it. And they ask, how do I protect myself? How do I get away from this? Where are the alternatives? But of course, increasingly, what's happened over these last 20 years is that it has become nearly impossible to get away from it. And this brings us to kind of the final bit of our story about the birth and growth of surveillance capitalism: started at Google, migrated to Facebook with an executive named Sheryl Sandberg, who I call the Typhoid Mary
of surveillance capitalism, because she actually was hired at Facebook to bring these methods and this economic logic to Facebook, which was floundering without it. From there it became the default option across the tech sector for the reasons I've said. How are you going to get an investor to help you build a business the old-fashioned capitalist way, which is that you make something that customers really want. And you enhance the quality of life of whole societies, whole populations, and you raise the standard of living and you employ people and so forth. No, no, no, no. That was going to take way too long to figure out how to make money that way. So this became the default model in the tech sector, but now we can no longer talk about this as confined to a couple of corporations or confined to an industry.
This economic logic is traveling across the normal economy. It's in the insurance industry, it's in the finance industry, education, health care, retail, now coming full circle back to some of those foundational corporations that make what used to be the big consumer goods, automobiles, appliances. And so now when you encounter these consumer goods, you know the automobile with Amazon's Alexa, personal digital assistant, built into the dashboard. Now Volvo has just announced it's building in cameras and all kinds of sensor devices into its dashboard to monitor behavior inside the car. Ford wants to switch its business model from making money from selling cars to making money from streaming data from the 100 million people who are driving around in Ford vehicles.
Because they want to make money the way Facebook and Google make money. And that's what the CEO of Ford has publicly said. So now this is becoming the dominant logic of capitalism that is sweeping through our economy. And every product that you encounter that begins with the word smart, and every service that you encounter that begins with the word personalized, is essentially a front-end interface for these supply chains that are picking up behavioral surplus wherever they can. Could be your dishwasher, picking up your conversations in the kitchen. Could be your child's doll that's picking up what are called dialogue chunks from your child's interaction with the doll. Those dialogue chunks are getting sent to a third-party company that analyzes those dialogue chunks for voice recognition that adds to the predictive
artificial intelligence. All of these things are interfaces for these vast and complex supply chains that now interface right at the edge of our private experience. Before you go on, I just want to interject that Sheryl Sandberg came to Facebook from Google. That was the nexus I think you were mentioning. Yes, yes, sorry. She was a key Google executive who was part of the creation of this economic logic I call surveillance capitalism. Yes. You do have probably the most absurd example of this process of inserting this dynamic into everyday life, with Levi's having introduced interactive denim at some point in the recent past. Yes. In collaboration with Google. So, you know, this is Levi's trying to make itself 21st century. So, the tracking is actually woven into the fabric. Now, as you quote,
the goal state of this product is "interactive yet authentic." This has become a comprehensive and internally consistent economic logic, defined by some critical imperatives, economic imperatives. And once you're an actor inside this box, as it were, you know, you are compelled to make certain things happen. Certain kinds of practices are compelled. The design of systems in certain ways is compelled, because of these economic imperatives. And what I discovered in my work is that once you understand the economic imperatives, it becomes easy, for us now, to predict the crazy stuff that we see in the news every single day.
Oh, Facebook is trying to steal your emails. A few weeks before that, there's a headline: oh, Google owns Nest, of course, which is a smart home device company. And somebody figured out that Nest's security system, no less, has a microphone buried in it that consumers were never told about. How did that happen? So, you know, literally every day there's at least one, often multiple, headlines, you know. So we start to feel like, what the heck is going on here? Well, the way to fix that feeling is to understand the economic imperatives. And then, when you begin to read this stuff, it's like, oh yeah, of course they're doing that, because that's totally an obvious and necessary activity
in relation to these economic imperatives. So nothing really surprises you, and it's very empowering then. Because now we know that the deal here is a specific economic logic, you know, invented by some people at a specific point in time under specific historical conditions. It's not like we're up against the entire world of digital technology. You make the point early on, and a couple of times through the book, you introduce the concept of inevitabilism, that we are supposed to accept all of this as inevitable. There is no choice. There is no alternative. You present, early on, the case history of the Aware Home, which was full of sensors, but, like certain other devices which could theoretically exist that monitor your fitness or other things, it existed in a closed loop,
that is to say, it shared the information only with you and you own the information. That's an alternative universe now, isn't it? That is. That has drifted away. It's melted into the mists of time. Those were the assumptions of the data scientists and the engineers who wrote about smart homes and telemedicine and all the things that we still care about today. But when they were writing about those things in the late 90s and the early 2000s, they simply assumed that these would be simple closed loops and they said explicitly, obviously these are very intimate data and of course it's going to be critical for them to be held privately. So the sensors in the home deliver the information to the occupants of the home, the sensors in the home that monitor your health, deliver only to the person who's the client, the patient, the individual. And if they want to share it with family members,
that's their call. And of course their doctor as well. So we've come a long way from that. In fact, it's not even like we've just traveled a road. We've crossed a gorge, maybe a Grand Canyon. So it's not like we can easily find our way back. I write about these wonderful scholars at the University of London who took some time out of their lives back in 2017 to do an analysis, back to Nest now, the Google smart home company. They did an analysis of one Nest thermostat. And they reckon that if you want to buy a Nest thermostat, you really need to review the so-called privacy policies. At least 1,000 of those privacy policies and contracts. At least 1,000. Because Nest streams to third parties.
Many of those domains owned by Google and Facebook, but others as well. Those third parties stream to third parties. Those third parties of third parties stream to third parties. And no one anywhere in the chain takes any accountability or responsibility for what those other third parties are going to do with your data. So you're basically broadcasting to the universe from your bedroom. There's just no way around that. And this is now a fundamental violation of things that are deep and elemental human needs. And when deep and elemental human needs, like the need for sanctuary, when these elemental needs are violated, that's when we have to change the conversation from needs to rights. Because, for example, we don't go around protesting for the right to take a deep breath. Or the right to walk on my feet.
Because no one is trying to take those needs away from us. In the 1970s, it felt like they were taking away the right to breathe. Yeah, okay. Well said. But you see what I'm saying. No one is saying I can't sit down. No one is saying I can't breathe. No one is saying I can't walk. So there's no need for me to fight for the right to walk. But now, the need for sanctuary, the need for spaces that are backstage, that I control, that I get to make choices, that I choose what I do and what I will do next and what I will do after that, that I choose how much privacy versus how much I want to share. That's all down to my autonomy, which is only in force when I have some kind of sanctuary. And now, that need for sanctuary, where we all need a backstage place, where we take a breath,
and we are, you know, I can be just myself. And even the great social psychologist Erving Goffman, you know, who wrote The Presentation of Self in Everyday Life, and life is a theater. We're all onstage to a certain extent throughout our lives, you know, I put on my cool sunglasses to go out in the street, you know, whatever it might be. We're all onstage a little bit. But what makes us able to function onstage is that there's an offstage that we can go to, to take that deep breath, to just be with ourselves, to just be with our family, our friends, to just be what we feel like is our true selves. Well, this world that we're talking about now is a direct assault on that kind of sanctuary. What I argue is that now we're in a fight for the right to sanctuary. Now, we've talked about prediction
as the essence of surveillance capitalism. And when we talk about how these companies make money, the way they're making money is by selling predictions to customers who want to know what we're going to do. Remember, this began with online targeted ads, and the better your predictions, the more money you make. And when we look at the pioneer surveillance capitalists, like Google and Facebook, we know that 98% of their revenues come from online targeted ads, so this is pretty important stuff. All right, companies, surveillance capitalists, compete on the quality of their predictions. And the quality of their predictions, it turns out, if we kind of reverse engineer that just for a moment, depends on just a couple of things, and those things help us understand the economic imperatives that define this whole space. So the first thing they figure out
is: to have good predictions, we need to have a lot of behavioral surplus, not a little bit, but a lot, and then even more than that. And the closer we can get to everything, the better off we are, you know, that urge toward totality. So this begins with extraction of our experience, and its translation into data at scale. We need economies of scale. And so competing on scale, that worked for a little while, but of course competition doesn't stay still, it's dynamic. And the next phase is, okay, we know we need a lot of behavioral surplus, but we also need varieties. It's not only scale, it's also scope. We need different kinds of surplus. We want you to get out of your laptop, out of your desktop. We want you to go out in the world now. We want to know where you are. We want to know what you're doing. We want to know who you're doing it with.
We want to know where you're going, what you're buying, what you're eating. We want to know your conversations. We want to know what you look like when you're walking down the street. What does your face look like? Because that will help us predict your emotions and your personality. We want to know whether you're stooped over or standing straight. We want to know your gait. We want to know the cadence of your speech. All critical sources of behavioral surplus. So we've got scale, and now we've got economies of scope. This is a concept that you introduce as ubiquity. Right, so this means that our sensors, our devices, join up with your phone, which is your computer in your pocket, to be everywhere. The surveillance capitalists have fought with the civil society organizations. They've fought with the government for the right to have their cheap sensors everywhere and anywhere they choose out in the world.
So that if you're walking down the street, they can have a picture of you walking down the street. And they can have facial recognition built into their sensors. And so they can grab your face, and they can grab your gait. They can grab your posture. They can grab all these things about you. And they have claimed, argued, that they have the right to do that, as their right to freedom of speech; they have the right to take these aspects of your experience without asking. And right now, there is no law that impedes them. So yes, so ubiquity. Now, final phase here. Prediction continues. The competition continues to heat up. More dynamism. Ultimately, one of the things they discover, Harry, is that the very best predictive data comes from actually intervening in your behavior. Intervening in the state of play in your life,
in order to tune and herd and coax, modify your behavior, to send you in particular directions that correspond to the commercial outcomes which they seek on behalf of their clients, on behalf of their business customers in these futures markets. The closer their predictions can get to straight-up observation, the more money they make. Well, you go back in time to somebody that I had studied and was appalled by in the dark ages, the earlier dark ages: B.F. Skinner, the inventor of behaviorism in psychology, who had posited that humans were nothing more than basically larger bipedal rats that could be trained with positive and negative reinforcement to engage in the desired behavior. Exactly.
So human beings are just another kind of organism, like rats or beetles or whatever it might be. Human beings do not have an interior. From Skinner's point of view, human beings are behavior, measurable behavior. They're organisms that behave. And if we can modify the behavior, by inducing with rewards or extinguishing with punishments, we have perfect prediction. And ultimately, this is what the surveillance capitalists have embraced. And so all these things that we've talked about, how they use the ubiquitous digital surround to unilaterally claim our private experience, how they use their technologies to turn it into behavioral data, to compute the surplus from that data, to turn it into predictions,
to create these new markets, these human futures markets, and sell that stuff. Everything here now becomes subordinate to this larger program, which I call the means of behavioral modification. The means of behavioral modification is now the point of having all this digital stuff. It's to herd our behavior as accurately and as effectively as possible, so that their predictions will make even more money. So now we're talking about automated systems that modify behavior at the scale of populations. So it comes down to the individual, but we're not just doing this for you, Harry. Even though it's being personalized for me, right? Yes, it's being personalized for you, but you're not alone. We're all the recipients of this personalization.
Orwell was very fond of the whole idea of euphemisms, and how power uses euphemisms to disguise its own workings. And so personalization is a euphemism in the grand Orwellian tradition. Personalization sounds good, but personalization really is now a form of behavioral conquest. So these operations were first experimented with, we first heard about it, when Facebook published what it called its massive-scale contagion experiments. The first one got published in a reputable scholarly journal in 2012. Turns out what they did was they used subliminal cues on your Facebook pages to see
if they could get more people to go vote in the 2012 elections in the US. They weren't trying to tell you who to vote for, just to get more people to go vote. And by subliminal cues, I mean that they manipulated things in your newsfeed. They brought in, like, a picture of your friend, a message from your friend, with the phrase "I voted" underneath the picture. Somebody that they knew not only was in your network, but held a lot of influence in your network, because these are things that they graph and that they know. They know your social graph, they know who are the high influencers in your social network and who are the, you know, faraway acquaintances and so forth. But other kinds of online cues too, little things that are buried in the newsfeed or little things that are buried in a message. So that comes out in 2012, and nobody has any clue that the Facebook researchers, in combination
with academic colleagues, are putting the finishing touches on a second contagion experiment. And that one is published, also in a prestigious journal, in 2013. That one is about emotional contagion. And in that one they used subliminal cues in your online Facebook pages to see if they could make some people feel happier and other people feel sadder. Manipulate your emotions. Okay. In both of these scholarly articles, Harry, when the data scientists wrote up the research results, they celebrated two facts. One was that, hey, we now know that through the manipulation of subliminal cues in the online environment, we can change real-world behavior and emotion. Hooray, we've cracked the code. Number two, they celebrated:
We now know how to do this in a way that completely and utterly bypasses human awareness. You never know what hit you. So we may say, well, that's all well and good in an online environment, but we can walk away from that. We don't have to engage in that online environment. Not true, but sometimes we can, you know, sort of tell ourselves that as a way of, like, not feeling too horrible about this. Well, it turns out that Google was developing its own experimental lab on these population-scale systems of behavioral modification, and its online laboratory came in the form of a game, a game called Pokemon Go, a game that people all over the world took pleasure in playing with their friends, with their families,
with their children. Pokemon Go was incubated inside Google over many years. It was a lab inside Google called Niantic Labs, run by a man named John Hanke, who had been in charge of Google Street View, Google Maps, Google Earth. He had invented the precursor to Google Earth, called Keyhole, originally purchased by the CIA, and later purchased by Google and brought into Google and renamed Google Earth. So John Hanke is someone with a long history of mapping the world and mapping human behavior in the world, which is also something that Street View specialized in. All right, so now Hanke's created this game. Just before they brought the game out into the public, they spun off Niantic from Google and presented it as a small company.
But Google remained the largest investor in this company. So this was Google born and bred. So how did Niantic make money? At first, Hanke and others said that they were making money when people buy little accessories and goodies for their Pokemon creatures. Badges. Badges, and the little things that get you to the next level of the game, and that are fun to own, and you can decorate your creatures with them and so forth. Turns out the way they were really making money was they had established their own behavioral futures markets, their own markets selling future behavior. In this case, they're selling future behavior not to advertisers, but to real-world establishments. McDonald's. Joe's pizza restaurant. The bar around the corner. Service establishments where you might need to
get your tires fixed. These establishments paid Pokemon Go a fee in return for footfall. Footfall means your real body, with your real legs attached to it, and your real feet attached to those legs, going someplace specific in your town, in your city, so that your feet fall on the floors of the establishments that are paying Pokemon Go for your presence: to sit in their restaurant, to purchase in these service establishments, whatever it may be. And the way this fits into the game is, Niantic inserts characters that you're supposed to be searching for in these retail establishments. Is that the idea? Niantic may put, for example, a Pokemon gym in the men's room of a bar. For people who, like me, really
missed out on the whole Pokemon thing: it superimposed augmented reality characters in this game on real-world mapped environments. Correct. Correct. You go into a restaurant and you look at the restaurant through your phone. And your phone imposes these augmented reality objects onto the actual environment in which you're standing. So you may walk into a bar, and it may show little Pokemon creatures sitting on the stools of the bar. And in the context of the game, you get rewarded for contacting these creatures, for finding them. You get rewarded for finding the creatures, and you rack up points for finding the creatures, and the more points you rack up, you pass through levels of the game. Like most games, you know, you're trying to get to the highest levels with the most number of badges and so forth and points. So what Niantic did, if you follow the logic here,
when we began, way back in early 2000, we said, okay, we're predicting the future with click-through. Well, now click-through is footfall. Right? Before, we were letting your fingers do the walking: click-through. But now it's your actual body in the real world, your feet, in a real place. It's behavior modification as gamification, which I think is the word you use. Indeed. So now we're using the logic of a game, which is, of course, based on rewards, to herd populations in the direction of the outcomes that serve the interests of our business customers. That's the picture here. So that is Pokemon Go as a population-scale experimental laboratory. Now, if you've read about Waze, W-A-Z-E, another Google application,
Waze is doing the same thing. Waze is now agglomerating data about you. Obviously, it knows where you are. And now not only does it tell you, like, what's the quickest way to get yourself home, but it says, hey, Harry, I know you've been looking for a new suit, and you're about to pass that store that's got exactly the kind of suit that you like. You know, why don't you make a detour there? And, hey, I think they might even have some things on sale for you. All I can say is, if Waze thinks I'm looking for a new suit, we're safer than I imagined, because, anyway, that's me and suits. We have sort of numbed ourselves to a lot of this stuff. We figure it's the cost of doing business. We buy the idea that it's inevitable, that this is just how this, you know, fishbowl digital world is meant to work. But that's not true at all.
The digital is easy to imagine without these methods that reach into our personal experience and then, you know, turn our lives into commodities to be sold and purchased for someone else's profit. This is a crazy kind of rogue, parasitic form of capitalism that has been allowed to flourish in these last 20 years, largely because there's been no law to impede it, largely because it's been designed in ignorance and we didn't know what the heck was going on anyway, and then, ultimately, because, no matter how much we try to work our way around it, we're living in a time right now where essentially everything that's internet is owned and operated by surveillance capitalism, and the alternatives have been foreclosed. So you're trying to get your test results from your doctor's office. You're trying to get your grades from your kid's school. You're trying to make plans with your family for dinner. You have no choice but to march yourself
through the channels that are also surveillance capitalism's supply chains. That's the fact of life right now. That's what the great philosopher Roberto Unger calls the dictatorship of no alternatives. But this is not inevitable. All we need is to get back to a place where we have laws that rein in the destructive excesses of capitalism, as we have in the past. All we need to do is get back to a world in which democracy is in charge of regulating capitalism, so that the words "market democracy" do not sound like an oxymoron. Because right now surveillance capitalism is on a collision course with democracy. Are we in danger of believing Facebook's hype, or the hype of these tech giants, that they're more capable than they really are? That their metrics overstate their potency
and their ability to modify our behavior, as in the case of Procter & Gamble saying, well, you lied about how effective our ads were? Well, this creation of the means of behavioral modification, automated processes that work at the scale of populations, this is work in progress. This is part of the longer story of the digital architecture becoming ubiquitous: perfecting the ways in which we interface with that technology so as to be tuned and herded, perfecting the ways to leverage social comparison dynamics in the networked environment, which are a big part of these herding operations, you know, because people largely don't want to stand out from the crowd and want to be accepted
and want to be part of the group. So the social comparison dynamics have a lot to do with it. But let's recall this: when John Hanke, the head of Niantic, the inventor of Pokemon Go, going all the way back to Street View and Google Earth and so forth, when John Hanke disclosed these population-scale behavior modification systems that were Pokemon Go, talked about how they really worked, how they guaranteed footfall to their business customers, like McDonald's paying Niantic to get folks into the restaurants, one of the very important things he said, when he described all this, is that everybody's making money. Everybody's making money. The customers, like the McDonald's and the Joe's Pizzas, they're all making money. Niantic's making money. Google, our chief investor, is making money.
Everybody's making money. Not the game players, but everybody else. So we underestimate at our peril. Do these folks hype it up a bit? Of course they do. But I would say that's more a phenomenon of fake-it-till-you-make-it than it is a phenomenon of, you know, creating something that is hallucinatory. Billions of dollars: there is huge market capitalization in these companies, and huge segments of that market capitalization are being invested in these systems. There's a Facebook memo. We rely a lot on leaked memos from these companies, because that gives us a little bit of a window on what's going on. So there was a leaked memo in 2017, written by Facebook executives for business customers in Australia and New Zealand. They had so much data
on six million adolescents and young adults in Australia and New Zealand that they had learned how to predict, to anticipate, the emotional cycles of these young people as they moved through the week. And they saw how the emotional cycles were characterized by various phases of anxiety Monday through Thursday, in anticipation of the weekend. And then Friday, Saturday, and Sunday, their emotional cycles were more characterized by reflection. For example: gee, I wonder if she liked how I kissed her, or how did I look on my date last night. So they now had ways to track the ups and downs of anxiety and reflection, and to signal their customers: look, if you've got a confidence-boosting product, we can use these data now to signal the moment of maximum probability
for when you send a message about your confidence-boosting product, the moment of maximum probability that it will get purchased. Now, everything I have just described to you, Harry, every single thing I have just said, you pivot that two degrees, not a hundred degrees, just two degrees: the emotional behavioral surplus, the micro-behavioral targeting for maximum effectiveness. You pivot that two degrees from commercial outcomes to political outcomes, and you've just invented Cambridge Analytica, because that's exactly what Cambridge Analytica did. And the forensic analyses of Cambridge Analytica are still being done. We don't know exactly when, or how much, we're going to learn of exactly how many votes they shifted. But we have plenty of data to suggest their general efficacy.
So Procter & Gamble, maybe they had some data, maybe they were tired of spending money. But these practices, when done correctly and powerfully, which is what these folks are figuring out how to do, better and better every single day, will take you all the way through to powerful systems of behavioral modification that erode democracy from the ground up, because they challenge human autonomy. They challenge the basic precepts of human agency, which is the whole grounds for independent moral judgment, the whole grounds for self-determination, individuality, and ultimately the whole grounds for freedom of will, without which the very thought of a democratic society is unsustainable. So they erode democracy from the ground up,
but they also erode democracy from the top down, because now here we are, entering the third decade of the 21st century, with an institutional pattern in which these companies, private capital, private surveillance capital, know everything about us while we know nothing about them. They know everything about us; we don't know that much about ourselves. They know everything about us, and that knowledge is used for others' advantage, not for ours. So this huge asymmetry of knowledge, and the power to modify behavior that accrues from that knowledge, this is not the empowerment and democratization that we expected from the digital age. This is returning us now to a pre-Gutenberg era, a feudal era: some people know everything, and a lot of people know almost none of it. You have a mantra which recurs through your book:
who knows, who decides, and who decides who decides. Which is a good place to stop and say: check out The Age of Surveillance Capitalism by our guest today, Dr. Shoshana Zuboff. Dr. Zuboff, thanks for being with me. Thank you so much for the opportunity, Harry. It's a pleasure. A tip of the show's chapeau to Jeffrey Talbot at Audioworks in New Orleans and to Brian Broughton, sound recordist in Los Angeles, for help with today's program, as well as thanks as always to Pam Haustead and Thomas Walsh at WWNO New Orleans. I'll come back to you again next week, same time. Meantime, remember: principles are good, politics are better.
Le Show comes to you from Century of Progress Productions and originates through the facilities of WWNO New Orleans, flagship station of the Change Is Easy Radio Network. So long, from the home of the homeless.
- Series
- Le Show
- Episode
- 2019-06-02
- Producing Organization
- Century of Progress Productions
- Contributing Organization
- Century of Progress Productions (Santa Monica, California)
- AAPB ID
- cpb-aacip-a41b2f22dd5
- Description
- Segment Description
- 00:00 | Open/ Molecules of US freedom | 01:14 | 'Freedom Gas' by Harry Shearer | 04:08 | Interview with Shoshana Zuboff, author of The Age of Surveillance Capitalism | 57:40 | 'Bad Kids To The Back' by Snarky Puppy /Close |
- Broadcast Date
- 2019-06-02
- Asset type
- Episode
- Media type
- Sound
- Duration
- 00:59:05.338
- Credits
Host: Shearer, Harry
Producing Organization: Century of Progress Productions
Writer: Shearer, Harry
- AAPB Contributor Holdings
-
Century of Progress Productions
Identifier: cpb-aacip-081e569ec36 (Filename)
Format: Zip drive
- Citations
- Chicago: “Le Show; 2019-06-02,” 2019-06-02, Century of Progress Productions, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC, accessed December 22, 2024, http://americanarchive.org/catalog/cpb-aacip-a41b2f22dd5.
- MLA: “Le Show; 2019-06-02.” 2019-06-02. Century of Progress Productions, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Web. December 22, 2024. <http://americanarchive.org/catalog/cpb-aacip-a41b2f22dd5>.
- APA: Le Show; 2019-06-02. Boston, MA: Century of Progress Productions, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Retrieved from http://americanarchive.org/catalog/cpb-aacip-a41b2f22dd5