Le Show; 2019-11-03
Transcript
This transcript was received from a third party and/or generated by a computer. Its accuracy has not been verified.
From deep inside your audio device of choice, ladies and gentlemen, what the frack? The British government has banned fracking with immediate effect. Cabinet ministers also warned shale gas companies that they would not support future fracking projects. The Guardian describes this as a crushing blow to companies that had been hoping to capitalize on new frontiers of growth in the fossil fuel industry. The decision was taken after a new scientific study warned it was not possible to rule out, quote, unacceptable, unquote, consequences for those living near fracking sites. Hello, Texas. Hello, Pennsylvania. The report, undertaken by the British Oil and Gas Authority, the OGA, also warned it was not possible to predict the magnitude of earthquakes fracking might trigger. Welcome to the party. The downturn in shale drilling has been so steep and brisk that oilfield companies are taking the unprecedented step of scrapping entire fleets of fracking gear, according to Bloomberg.
Almost half of U.S. fracking equipment is expected to be sitting idle within weeks, and shale specialists are retiring truck-mounted pumping units and other equipment used to frack shale rock. Whereas in previous market slumps frackers parked unused equipment to await a revival, this time it's different: gear is being stripped down for parts or sold for scrap. What the frack? And now, here's maybe the apology of the week, from former Congresswoman Katie Hill of California. To every young person who saw themselves and their dreams reflected in me, I'm sorry. To those who felt like I gave them hope in one of the darkest times in our nation's history, I'm sorry. To my family, my friends, my staff, my colleagues, my mentors, to everyone who has supported and believed in me, I'm sorry.
To the thousands of people who spent hours knocking doors in the hot summer sun, who made countless phone calls, who sacrificed more than I could ever know, to give everything they could in every possible way, so that I could be here: I am so, so sorry. Nude pictures of her had appeared in the British, quote, newspaper, unquote, the Daily Mail at the hands, she says, of her abusive ex. She also says she wasn't aware that the nude photos were being taken. She was also accused of having an affair with a member of her staff after she was elected, and she admitted having an affair with one of her staff before she was elected. She's so sorry. The fast food chain McDonald's has apologized for a Halloween marketing campaign which promoted a dessert in Portugal with the words Sundae Bloody Sundae, sundae spelled like the dessert. The term Bloody Sunday, with a Y, refers to one of the worst days of the Troubles in Northern Ireland.
Thirteen people were shot dead by the British Army. The advertisements sparked outrage on social media. What else happens on social media? A McDonald's spokeswoman said the advertisements were designed as a Halloween celebration, but the promotion has since been canceled. McDonald's did say the marketing campaign was not intended to be, quote, an insensitive reference to any historical event. We sincerely apologize for any offense or distress this may have caused, unquote, said a spokeswoman. A photo of the advertisement with McDonald's branding was shared online by an Irish Twitter user who commented, quote, Portugal is canceled, unquote. That'll do it. The apologies of the week, ladies and gentlemen, a copyrighted feature of this broadcast. We're re-broadcasting today an interview I did about six months ago with a woman who's written a book about the surveillance-permeated world we're increasingly living in.
And my guest today is the author of a relatively new book on the subject, The Age of Surveillance Capitalism. She's Dr. Shoshana Zuboff, and she is currently with Harvard Business School. She's done a term at Harvard Law School as well. She's the author of several books, including In the Age of the Smart Machine: The Future of Work and Power, and The Support Economy. And as I say, her latest book is The Age of Surveillance Capitalism. Dr. Zuboff, welcome. Thank you so much, Harry. It's a pleasure to be with you. It's a very dense book, conceptually dense, and we'll deal with a few of those concepts. I'll just throw out two or three that I think are basic to understanding where you're coming from: behavioral surplus, the prediction imperative, and the drive of surveillance capitalism to achieve certainty through, among other means, behavior modification.
There's a spray of concepts. Try to knit them all together in the first little bit here, if you would please. Okay. Well, let's start with a definition of surveillance capitalism because that will give us a framework to do the knitting that we need to do. So surveillance capitalism, I argue, in many ways, diverges from the history of market capitalism, diverges in some ways that are not well recognized and pretty crazy, but there is a significant way in which surveillance capitalism emulates the age-old pattern of how capitalism has evolved. So let's talk about that for a moment. Capitalism typically evolves by taking things that live outside the marketplace, non-market, things or activities, bringing them into the market, turning them into what we call commodities
things that can be sold and purchased. So famously, industrial capitalism claimed nature for the market dynamic. And that means bringing it into the marketplace, turning it into raw materials that can be sold and purchased, real estate, land. Surveillance capitalism follows this pattern, but with a dark and unexpected twist. What it does is to unilaterally claim private human experience for the market dynamic. It claims private human experience as a free source of raw material to be translated into behavioral data. These data then are combined with advanced computational capabilities, machine learning, and they spit out products as all machines do.
But in this case, their products are analyses of behavioral data specifically geared to predict our future behavior. What we will do now, soon, and later. And these prediction products are then sold into a new kind of marketplace. This is a marketplace that trades exclusively in human futures, trades exclusively in predictions of our future behavior. So maybe for some of our listeners what I've just described will sound a little bit science-fiction-y. And the fact is, what I've just described perfectly corresponds to what we have come to think of as online targeted advertising. And in fact, as many people know, online targeted advertising was invented at Google; so was surveillance capitalism.
And it was back in 2001, in the teeth of financial emergency as a result of the dot-com bust, that Google's founders decided that if they were going to save their company, they had to get serious about advertising. They were under so much pressure, their very swanky venture capitalist investors were threatening to pull out; everything that they had worked for was on the line. And before this, they had really had a pretty negative view of advertising. But if they were going to start to make some money and not be imploded by this financial emergency, it was going to be advertising. What they discovered was that, of course, they were getting behavioral data as people searched and browsed online. They were using those data to improve the search engine and to create products like translation.
But there was extra data that was produced that they didn't need for service improvement. Those data, back at that time, were called digital exhaust, great name, waste material. And who's going to criticize anybody for finding a way to recycle waste material? It's ecological. Exactly. So there had been a few folks in the company who had been experimenting with these leftover data. They were haphazardly stored in data logs and servers. But the folks who had been playing around with this had come to understand that these leftover data had tremendous predictive value. So now the founders decided, we're going to turn to these leftover data, and instead of calling them leftover data, Harry, I call them behavioral surplus. It was a surplus because it was more than they needed to do their business well.
So they turned to this behavioral surplus that was sitting around and began to examine it for its predictive qualities. And what happened was, they combined this behavioral surplus with their computational capabilities; even back in the early 2000s, Google was way ahead of the pack in computational sophistication. Even back then, Larry Page, one of the founders, referred to their machines as our AI, artificial intelligence. So now you've got this behavioral surplus, which is full of the kind of things that some people now would call metadata. Not so much the keyword that you searched for, perhaps, but what time of day, and how long did you spend, and how many words did you type in, in order to finally get to the thing that satisfied your question or what you were looking for?
Later, in the case of Facebook, we see behavioral surplus coming not just from what we disclose to Facebook. And it's worth saying this, because a lot of folks are confused about it: when Facebook says, we'll let you see the data that we have on you, what they mean is, we'll let you see the data that you gave us. In other words, the information that you typed in at some point. So the information you're aware you gave to us? Exactly. But where the behavioral surplus comes in, Harry, is that it's not just what you post. It's how you post. If you're making dinner plans with your friends for the evening, do you say I'll see you at 7:45, or do you say I'll see you later? Do you have run-on sentences, or do you use bullet points? If you're posting pictures, what's the level of saturation of those photos, what kinds of colors, right? Do you use exclamation points? Well, then you could be president of the United States. These kinds of hidden behavioral signals actually can be used to predict your personality, your feelings, your sexual orientation, your political orientation. There are rooms full of research at this point that take these tiny behavioral signals that I call behavioral surplus and use them to predict almost anything about you, things you have no idea you are disclosing. These are used to construct personality profiles; in our normal understanding, that would be what they're putting together, right? Personality profiles, emotional cycles. We'll get into a little bit more of this when we start talking about another part of your question, behavioral modification, but there's almost
nothing that researchers haven't tried to use these data to predict about you. So what Google discovered was that by using this behavioral surplus, doing the computations on these kinds of data, they could come up with predictions of a very specific future behavior. At that point, back in this originating context, that was a click-through rate. So it's easy to just think of it as a click-through rate, big deal, but that click-through rate is a prediction of a fragment of human behavior. That click-through rate was sold to a marketplace of business customers who want to know what we're going to do. Now those were advertisers, and they wanted to know where we're going to click. But the point is, those were the first markets established exclusively to sell predictions of our futures.
This was institutionalized at Google, and it made a ton of money, because now they sold advertisers these predictions. Instead of letting advertisers pick keywords and where to place their ads: you bet on these predictions and you're going to make a lot of money. We're not going to let you into the black box that produces these predictions; you buy them, you'll make a lot of money. And that's exactly what happened. So between 2000, when this thing first got going, and 2004, when Google went public and, in its IPO documents, its initial public offering documents, the world first got a glimpse of what had been going on at Google, it became clear that just in those few years, Harry, Google's revenue had increased by 3,590 percent. From the strength of this logic that I've just described to you, what that of course means is that these guys set the bar for every investor in Silicon Valley.
Who wants to invest in a company that's going to create value the old-fashioned way, when you can invest in a company that's going to make money by stealing our private experience, with methods and mechanisms that, right from the beginning, they understood had to be designed to keep us in ignorance? Right from the earliest patents, you see the data scientists from Google describing their ability to hunt and capture behavioral surplus from all over the online world, all over the internet, and to find out things about us that we never intended to disclose. And they celebrate in those patents that they know how to do this in ways that are undetectable, untraceable, that bypass our awareness. You know, I call this the social relations of the one-way mirror, Harry, and of course this is what puts the surveillance in surveillance capitalism, because without the one-way mirror, none of this would work. What we know from all the research done from the early 2000s onward is that when folks actually find out what's going on backstage here, they are appalled, and they don't want any part of it, and they ask: how do I protect myself? How do I get away from this? Where are the alternatives? But of course, increasingly, what's happened over these last 20 years is that it has become nearly impossible to get away from it. And this brings us to kind of the final bit of our story about the birth and growth of surveillance capitalism: it started at Google, migrated to Facebook with an executive named Sheryl Sandberg, whom I call the Typhoid Mary of surveillance capitalism, because she actually was hired at Facebook to bring these methods and this
economic logic to Facebook, which was floundering without it. From there it became the default option across the tech sector, for the reasons I've said. You know, how are you going to get an investor to help you build a business the old-fashioned capitalist way, which is that you make something that customers really want, and you enhance the quality of life of whole societies, whole populations, and you raise the standard of living and you employ people and so forth? No, no, no, no, that was going to take way too long to figure out how to make money that way. So this became the default model in the tech sector. But now we can no longer talk about this as confined to a couple of corporations or confined to an industry. This economic logic is traveling across the normal economy: it's in the insurance industry,
it's in the finance industry, education, healthcare, retail, now coming full circle back to some of those foundational corporations that make what used to be the big consumer goods, automobiles, appliances. And so now you encounter these consumer goods, you know, the automobile with Amazon's Alexa personal digital assistant built into the dashboard. Now Volvo has just announced it's building cameras and all kinds of sensor devices into its dashboard to monitor behavior inside the car. Ford wants to switch its business model from making money from selling cars to making money from streaming data from the 100 million people who are driving around in Ford vehicles, because they want to make money the way Facebook and Google make money, and that's what the CEO of Ford has publicly said. So now this is becoming the dominant logic of capitalism that is sweeping through our economy. And every product that you encounter that begins with the word smart, and every service that you encounter that begins with the word personalized, is essentially a front end, an interface, for these supply chains that are picking up behavioral surplus wherever they can. It could be your dishwasher picking up your conversations in the kitchen; it could be your child's doll picking up what are called dialogue chunks from your child's interaction with the doll. Those dialogue chunks are getting sent to a third-party company that analyzes them for voice recognition that adds to the predictive artificial intelligence.
All of these things are interfaces for these vast and complex supply chains that now interface right at the edge of our private experience. Before you go on, I just want to interject that Sheryl Sandberg came to Facebook from Google. That was the nexus I think you were mentioning in passing. Yes, yes, sorry. She was a key Google executive who was part of the creation of this economic logic I call surveillance capitalism, yes. You do have probably the most absurd example of this process of inserting this dynamic into everyday life, with Levi's having introduced interactive denim at some point in the recent past. Yes. In collaboration with Google, so that, this is Levi's trying to make itself 21st-century, the tracking is actually woven into the fabric. And as you quote the goal state of this product: interactive, yet authentic.
This has become a comprehensive and internally consistent economic logic defined by some critical imperatives, economic imperatives, and once you're an actor inside this box as it were, you know, you are compelled to make certain things happen. Certain kinds of practices are compelled. The design of systems in certain ways is compelled because of these economic imperatives. What I've discovered in my work is that once you understand the economic imperatives, it becomes easy to predict, for us now to predict, the crazy stuff that we see in the news every single day.
Oh, Facebook is trying to steal your emails. A few weeks before that, there's a headline: oh, Google owns Nest, of course, which is a smart-home device company, and somebody figured out that Nest's security system, no less, has a microphone buried in it that consumers were never told about. How did that happen? So, you know, literally every day there's at least one, often multiple, headlines, you know, so we start to feel like, what the heck is going on here? Well, the way to fix that feeling is to understand the economic imperatives. And then, when you begin to read this stuff, it's like, oh, yeah, of course they're doing that, because that's totally an obvious and necessary activity in relation to these economic imperatives.
So nothing really surprises you, and it's very empowering then, because now we know that the deal here is a specific economic logic, you know, invented by some people, at a specific point in time, under specific historical conditions. It's not like we're up against the entire world of digital technology. You make the point early on, and a couple times through the book, you introduce the concept of inevitabilism: that we are supposed to accept all of this as inevitable, there is no choice. T-I-N-A, there is no alternative. You present early on the case history of the Aware Home, which was full of sensors but, like certain other devices which could theoretically exist, that monitor your fitness or other things, it existed in a closed loop. That is to say, it shared the information only with you, and you owned the information. That's an alternative universe now, isn't it? That has drifted away; it's melted into the mists of time.
Those were the assumptions of the data scientists and the engineers who wrote about smart homes and telemedicine, and all the things that we still care about today, but when they were writing about those things in the late 90s and the early 2000s, they simply assumed that these would be simple closed loops, and they said explicitly, obviously these are very intimate data, and of course it's going to be critical for them to be held privately. So the sensors in the home deliver the information to the occupants of the home, the sensors in the home that monitor your health, deliver only to the person who's the client, the patient, the individual, and if they want to share it with family members, that's their call, and of course their doctor as well. So we've come a long way from that, in fact, it's not even like we've just traveled
a road, we've crossed a gorge, maybe a Grand Canyon, so it's not like we can easily find our way back. I write about these wonderful scholars at the University of London who took some time out of their lives back in 2017 to do an analysis, back to Nest now, the Google smart-home company. They did an analysis of one Nest thermostat, and they reckoned that if you want to buy a Nest thermostat, you really need to review the so-called privacy policies, at least a thousand of those privacy policies and contracts, at least a thousand, because Nest streams to third parties, many of those domains owned by Google and Facebook, but others as well. Those third parties stream to third parties; those third parties of third parties stream
to third parties, and no one anywhere in the chain takes any accountability or responsibility for what those other third parties are going to do with your data. So you're basically broadcasting to the universe from your bedroom; there's just no way around that. And this is now a fundamental violation of deep and elemental human needs. And when deep and elemental human needs, like the need for sanctuary, are violated, that's when we have to change the conversation from needs to rights. Because, for example, we don't go around protesting for the right to take a deep breath, or the right to walk on my feet, because no one is trying to take those needs away from us.
Yeah, and in the 1970s, it felt like they were taking away the right to breathe. Yeah, okay, well said. But you see what I'm saying: no one is saying I can't sit down, no one is saying I can't breathe, no one is saying I can't walk. So there's no need for me to fight for the right to walk. But now the need for sanctuary, the need for spaces that are backstage, that I control, where I get to make choices, where I choose what I do and what I will do next, and what I will do after that, where I choose how much privacy versus how much I want to share. That's all down to my autonomy, which is only in force when I have some kind of sanctuary. And now that need for sanctuary, where we all need a backstage place where we take a breath and, you know, I can be just myself. Even the great social psychologist Erving Goffman, who wrote about The Presentation of Self in Everyday Life, and life as a theater: we're all on stage to a certain extent throughout our lives. You know, I put on my cool sunglasses to go out on the street, you know, whatever it might be; we're all on stage a little bit. But what makes us able to function on stage is that there's an offstage that we can go to, to take that deep breath, to just be with ourselves, to just be with our family, our friends, to just be what we feel like is our true selves. Well, this world that we're talking about now is a direct assault on that kind of sanctuary. What I argue is that now we're in a fight for the right to sanctuary. Now, we've talked about prediction as the essence of surveillance capitalism, and we talk about how these companies make money.
The way they're making money is by selling predictions to customers who want to know what we're going to do. Remember, this began with online targeted ads, and the better your predictions, the more money you make. And when we look at the pioneer surveillance capitalists, like Google and Facebook, we know that 98 percent of their revenues come from online targeted ads, so this is pretty important stuff. All right: surveillance capitalists compete on the quality of their predictions. And the quality of their predictions, it turns out, if we kind of reverse-engineer that just for a moment, depends on just a couple of things. And those things help us understand the economic imperatives that define this whole space. So the first thing they figure out is, to have good predictions, we need a lot of behavioral surplus, not a little bit, but a lot. And then even more than that, the closer we can get to everything, the better off
we are, that urge toward totality. So this begins with extraction of our experience and its translation into data at scale. We need economies of scale. And so competing on scale, that worked for a little while, but of course, competition doesn't stay still, it's dynamic. And the next phase is, okay, we know we need a lot of behavioral surplus, but we also need varieties. It's not only scale. It's also scope. We need different kinds of surplus. We want you to get out of your laptop, out of your desktop. We want you to go out in the world now. We want to know where you are. We want to know what you're doing. We want to know who you're doing it with. We want to know where you're going, what you're buying, what you're eating. We want to know your conversations.
We want to know what you look like when you're walking down the street. What does your face look like, because that will help us predict your emotions and your personality. We want to know whether you're stooped over or standing straight. We want to know your gait. We want to know the cadence of your speech, all critical sources of behavioral surplus. So we've got scale, and now we've got economies of scope. This is a concept that you introduce as ubiquity. Ubiquity, right. So this means that our sensors, our devices, join up with your phone, which is your computer in your pocket, to be everywhere. The surveillance capitalists have fought with the civil society organizations; they've fought with the government for the right to have their cheap sensors everywhere and anywhere they choose out in the world, so that if you're walking down the street, they can have a picture of you walking on the street. And they can have facial recognition built into their sensors, and so they can grab your face and they can grab your gait, they can grab your posture, they can grab all these things about you. And they have claimed, argued, that they have the right to do that, as their right to freedom of speech; they have the right to take these aspects of your experience without asking. And right now there is no law that impedes them. So yes, ubiquity. Now, final phase here: the prediction competition continues to heat up, more dynamism. Ultimately, one of the things they discover, Harry, is that the very best predictive data comes from actually intervening in your behavior, intervening in the state of play in your life, in order to tune and herd and coax, modify your behavior, to send you in particular directions that correspond to the commercial outcomes which they seek
on behalf of their clients, on behalf of their business customers in these futures markets. The closer their predictions can get to straight-up observation, the more money they make. Well, you go back in time to somebody that I had studied, and was appalled by, in the dark ages, the earlier dark ages: B.F. Skinner, the inventor of behaviorism in psychology, who had posited that humans were nothing more than basically larger bipedal rats that could be trained with positive and negative reinforcement to engage in the desired behavior. Exactly. So human beings are just another kind of organism, like rats or beetles or whatever it might be. Human beings do not have an interior. From Skinner's point of view, human beings are behavior, measurable behavior.
They are organisms that behave. And if we can modify the behavior, by inducing with rewards or extinguishing with punishments, then we have perfect prediction. And ultimately, this is what the surveillance capitalists have embraced. And so all these things that we've talked about, how they use the ubiquitous digital surround to unilaterally claim our private experience, how they use their technologies to turn it into behavioral data, to compute the surplus from that data, to turn it into predictions, to create these new markets, these human futures markets, and sell that stuff. Everything here now becomes subordinate to this larger program, which I call the means of behavioral modification.
The means of behavioral modification is now the point of having all this digital stuff: to herd our behavior as accurately and as effectively as possible, so that their predictions will make even more money. So now we're talking about automated systems that modify behavior at the scale of populations. So it comes down to the individual, but we're not just doing this for you, Harry. Even though it's being personalized for me, right? Yes, it's being personalized for you, but you're not alone. We're all the recipients of this personalization. Orwell was very fond of the whole idea of euphemisms, and how power uses euphemisms to disguise its own workings. And so personalization is a euphemism in the Orwellian, the grand Orwellian tradition.
Personalization sounds good, but personalization really is now a form of behavioral conquest. So these operations were first experimented with, we first heard about them, when Facebook published what it called its massive-scale contagion experiments. The first one got published in a reputable scholarly journal in 2012. It turns out what they did was they used subliminal cues on your Facebook pages to see if they could get more people to go vote in the 2012 elections in the US. They weren't trying to tell you who to vote for, just to get more people to go vote. And by subliminal cues, I mean that they manipulated things in your newsfeed; they brought in, like, a picture of your friend, a message from your friend with the phrase I voted
underneath the picture, somebody that they knew not only was in your network but held a lot of influence in your network, because these are things that they graph and that they know. They know your social graph; they know who the high influencers are in your social network and who the, you know, faraway acquaintances are, and so forth. But other kinds of online cues too, little things that are buried in the newsfeed or little things that are buried in a message. So that comes out in 2012, and nobody has any clue that the Facebook researchers, in combination with academic colleagues, are putting the finishing touches on a second contagion experiment, and that one is published, also in a prestigious journal, in 2013. That one is about emotional contagion, and in that one they used subliminal cues in your online Facebook pages to see if they could make some people feel happier and other people
feel sadder, manipulate your emotions, okay. In both of these scholarly articles, Harry, when the data scientists wrote up the research results, they celebrated two facts. One was that, hey, we now know that through the manipulation of subliminal cues in the online environment, we can change real-world behavior and emotion. Hooray, we've cracked the code. Number two, they celebrated that we now know how to do this in a way that completely and utterly bypasses human awareness. You never know what hit you. So we may say, well, that's all well and good, the online environment, but we can walk away from that.
We don't have to engage in that online environment. Not true, but sometimes we can tell ourselves that as a way of not feeling too horrible about this. Well, it turns out that Google was developing its own experimental lab for these population-scale systems of behavioral modification, and its online laboratory came in the form of a game, a game called Pokemon Go, a game that people all over the world took pleasure in playing with their friends, with their families, with their children. Pokemon Go was incubated inside Google over many years, in a lab inside Google called Niantic Labs, run by a man named John Hanke, who had been in charge of Google Street View, Google Maps, Google Earth. He had invented the precursor to Google Earth, called Keyhole, originally funded by the
CIA, and later purchased by Google, brought into Google, and renamed Google Earth. So John Hanke is someone with a long history of mapping the world and mapping human behavior in the world, which is also something that Street View specialized in. All right. So now Hanke's created this game. Just before they brought the game out to the public, they spun off Niantic from Google and presented it as a small company, but Google remained the largest investor in this company. So this was Google born and bred. So how did Niantic make money? At first Hanke, and others, said that they were making money when people buy little accessories and goodies for their Pokemon creatures, badges, and the little things that get you to
the next level of the game and that are fun to own and you can decorate your creatures with them and so forth. Turns out the way they were making money was they had established their own behavioral futures markets, their own markets selling future behavior. In this case, they're selling future behavior not to advertisers, but to real world establishments. McDonald's, Joe's pizza restaurant, the bar around the corner, service establishments where you might need to get your tires fixed. These establishments paid Pokemon Go a fee in return for footfall. Footfall means your real body with your real legs attached to it and your real feet attached to those legs, going someplace specific in your town, in your city so that your feet
fall on the floors of the establishments that are paying Pokemon Go for your presence, to sit in their restaurant, to purchase in these service establishments, whatever it may be. And the way this fits into the game is, Niantic inserts Pokemon characters that you're supposed to be searching for in these retail establishments. Is that the idea? Yes, so Niantic may put, for example, a Pokemon gym in the men's room of a bar, or... And for people who, like me, missed out on the whole Pokemon thing: it superimposed augmented-reality characters in this game on real-world mapped environments, correct? Correct, correct. So you go into a restaurant and you look at the restaurant through your phone, and your phone imposes these augmented-reality objects onto the actual environment in which you're
standing. So it may show, you walk into a bar and it may show little Pokemon creatures sitting on the stools of the bar. And in the context of the game, you get rewarded for contacting these creatures, or finding... For finding the creatures, and you rack up points for finding the creatures, and the more points you rack up, you pass through levels of the game. Like most games, you're trying to get to the highest levels with the most badges and points. So what Niantic did, if you follow the logic here: when we began way back in early 2000, we said, okay, we're predicting the future with click-through. Well, now click-through is footfall, right? Before, we were letting your fingers do the walking, click-through, but now it's your actual body in the real world, your feet, in a real place.
It's behavior modification as gamification, which I think is a word you use. Indeed. We're using the logic of a game, which is of course based on rewards, to herd populations in the direction of the outcomes that serve the interests of our business customers. That's the picture here. So that is Pokemon Go as a population-scale experimental laboratory. Now, if you've read about Waze, W-A-Z-E, another Google application, Waze is doing the same thing. Waze is now agglomerating data about you. Obviously it knows where you are, and not only does it now tell you, like, what's the best, the quickest way to get yourself home, but it says, hey, Harry, I know, you know, you've been looking for a new suit, and you're
about to pass that store that's got exactly the kind of suit that you like. You know, why don't you make a detour there, and, hey, I think they might even have some things on sale for you. All I can say is, if Waze thinks I'm looking for a new suit, we're safer than I imagined, because, anyway, that's me and suits. We have sort of numbed ourselves to a lot of this stuff. We figure it's the cost of doing business. We buy the idea that it's inevitable, that this is just how this, you know, fishbowl digital world is meant to work. But that's not true at all. The digital is easy to imagine without these methods that reach into our personal experience and then, you know, turn our lives into commodities to be sold and purchased for someone else's profit.
This is a crazy kind of rogue, parasitic form of capitalism that has been allowed to flourish in these last 20 years, largely because there has been no law to impede it, largely because it's been designed to keep us in ignorance and we didn't know what the heck was going on anyway. And then, ultimately, because no matter how much we try to work our way around it, we're living in a time right now where essentially everything that's internet is owned and operated by surveillance capitalism, and the alternatives have been foreclosed. So you're trying to get your test results from your doctor's office. You're trying to get your grades from your kid's school. You're trying to make plans with your family for dinner. You have no choice but to march yourself through the channels that are also surveillance capitalism's supply chains. That's the fact of life right now. That's what the great philosopher Roberto Unger calls the dictatorship of no alternatives. But this is not inevitable.
All we need is to get back to a place where we have laws that rein in the destructive excesses of capitalism, as we have in the past. All we need to do is get back to a world in which democracy is in charge of regulating capitalism, so that the words market democracy do not sound like an oxymoron, because right now surveillance capitalism is on a collision course with democracy. Are we in danger of believing Facebook's hype, or the hype of these tech giants, that they're more capable than they really are, that their metrics overstate their potency and their ability to modify our behavior, as in the case of Procter and Gamble saying, you lied about how effective our ads were? Well, this creation of the means of behavioral modification, automated processes that work
at the scale of populations, this is a work in progress. This is part of the longer story of the digital architecture becoming ubiquitous, and of perfecting the ways in which we interface with that technology to be tuned and herded, perfecting the way to leverage social comparison dynamics in the network environment, which are a big part of these herding operations, you know, because people largely don't want to stand out from the crowd and want to be accepted and want to be part of the group. So the social comparison dynamics have a lot to do with it. But let's recall this: John Hanke, the head of Niantic, the inventor of Pokemon Go, goes all the way back to Street View and Google Earth and so forth. And when Hanke disclosed these population-scale behavior modification systems that were Pokemon Go, talked about how they really worked, guaranteed footfall to business customers like McDonald's paying Niantic to get folks into their restaurants, one of the very important things he said when he described all this is that everybody's making money, everybody's making money. The customers, like the McDonald's and the Joe's Pizzas, they're all making money; Niantic's making money; Google, their chief investor, is making money. Everybody's making money, not the game players, but everybody else. So we underestimate at our peril. Do these folks hype it up a bit? Of course they do, but I would say that's more a phenomenon of fake it till you make it than a phenomenon of creating something that is hallucinatory.
Billions of dollars, we're talking about the huge market capitalization of these companies, and huge segments of that market capitalization are being invested in these systems. There's a Facebook memo; we rely a lot on leaked memos from these companies, because that gives us a little bit of a window on what's going on. So there was a leaked memo in 2017, written by Facebook executives for its business customers in Australia and New Zealand. They had so much data on six million adolescents and young adults in Australia and New Zealand that they had learned how to predict, to anticipate, the emotional cycles of these young people as they move through the week. And they saw how the emotional cycles were characterized by various phases of anxiety Monday through Thursday, in anticipation of the weekend.
And then Friday, Saturday, and Sunday, their emotional cycles were characterized more by reflection. For example: gee, I wonder if she liked how I kissed her, or how did I look on my date last night. So they now had ways to track the ups and downs of anxiety and the ups and downs of reflection, and to signal their customers: look, if you've got a confidence-boosting product, we can use these data to signal the point of maximum probability, so that when you send a message about your confidence-boosting product, maximum probability it will get purchased. Now everything I have just described to you, Harry, every single thing I have just said, you pivot that two degrees, not 100 degrees. The emotional behavioral surplus for micro-behavioral targeting for maximum effectiveness: you pivot that two degrees, from commercial outcomes to political outcomes, and you've just
invented Cambridge Analytica because that's exactly what Cambridge Analytica did. And the forensic analyses of Cambridge Analytica are still being done. We don't know exactly when or how much we're going to learn of exactly how many votes they shifted, but we have plenty of data to suggest their general efficacy. So Procter and Gamble, maybe they had some data, maybe they were tired of spending money, but these practices, when done correctly and powerfully, which is what these folks are figuring out how to do better and better every single day, will take you all the way through to powerful systems of behavioral modification that erode democracy from the
ground up because they challenge human autonomy. They challenge the basic precepts of human agency, which is the whole grounds for independent moral judgment, the whole grounds for self-determination, individuality, ultimately the whole grounds for freedom of will without which the very thought of a democratic society is unsustainable. So they erode democracy from the ground up, but they also erode democracy from the top down because now here we are entering the third decade of the 21st century with an institutional pattern of these companies, private capital, private surveillance capital, where they know everything about us, we know nothing about them. They know everything about us, we don't know about ourselves.
They know everything about us, but they use it for others' advantage, not for ours. So this huge asymmetry of knowledge, and the power to modify behavior that accrues from that knowledge, this is not the empowerment and democratization that we expected from the digital age. This is returning us now to a pre-Gutenberg era, a feudal era. Some people know everything; a lot of people know almost none of that. You have a mantra which recurs through your book: who knows? Who decides? And who decides who decides? Which is a good place to stop and say, check out The Age of Surveillance Capitalism by our guest today, Dr. Shoshana Zuboff. Dr. Zuboff, thanks for being with me. Thank you so much for the opportunity, Harry. It's a pleasure. A tip of the chapeau to the San Diego desk, for just being there.
Series
Le Show
Episode
2019-11-03
Producing Organization
Century of Progress Productions
Contributing Organization
Century of Progress Productions (Santa Monica, California)
AAPB ID
cpb-aacip-035a935c75c
Description
Segment Description
00:00 | Open/ What the Frack? | 01:35 | The Apologies of the Week : Congresswoman Katie Hill, McDonald's | 04:00 | Interview with Shoshana Zuboff, author of The Age of Surveillance Capitalism | 57:33 | 'The Continental' by The Conrad Salinger Orchestra /Close |
Broadcast Date
2019-11-03
Asset type
Episode
Media type
Sound
Duration
00:59:05.338
Credits
Host: Shearer, Harry
Producing Organization: Century of Progress Productions
Writer: Shearer, Harry
AAPB Contributor Holdings
Century of Progress Productions
Identifier: cpb-aacip-05bdf125e3e (Filename)
Format: Zip drive