Inflection Point with Lauren Schiller; #103; How Search Engines Reinforce Racism & Sexism - Dr. Safiya Umoja Noble
Transcript
This transcript was received from a third party and/or generated by a computer. Its accuracy has not been verified. If this transcript has significant errors that should be corrected, let us know, so we can add it to FIX IT+.
I think it's actually objectifying women, something that's been with us for a long time, and there are industries that made a lot of money off of that. And now those same practices are happening in an environment like the internet and in Google search.

I'm Lauren Schiller, and this is Inflection Point.

I think of Google, and search engines in general, as a second brain. Whenever I need to remember something trivial, the name of that movie, or the title of that song stuck in my head, or if I'm doing research on a guest, or when I'm talking with my production team and need to add another fact, I Google the answer to my question. Considering that Google processes forty thousand search queries every second, which translates to one point two trillion searches per year worldwide, I'm not the only one who Googles that stuff. Either way, it's safe to say that nearly everyone with an internet connection or a mobile phone has adopted a search engine as their second brain. But what if using this backup
brain is backfiring on us? What if our reliance on search engines is perpetuating oppressive ideas and hateful ideologies, and even swayed an election, right along with all that mundane information? My guest, Dr. Safiya Umoja Noble, was in library and information science school when she began to notice something that made her extremely uncomfortable. "Everybody was talking about Google like it was the new public library, and that was the first moment where I was like, hmm, that's interesting." She had worked in marketing for fifteen years before returning to grad school to study library science, and she always saw Google for what it really is: an advertising platform, where the searchers are question-askers, the advertisers are companies paying to optimize content and make their content visible, and people clicking on that content signal that it's credible or viable. So what does this really mean
when oppressive ideologies can take over keywords, identities, and communities? That's right: the information you see in your search results is heavily manipulated through a strange, complicated digital dance. Noble found this out one day when a colleague of hers, Dr. André Brock, suggested she Google the phrase "black girls," and when she did, she saw that the top search results were images and sites that perpetuated negative stereotypes. And she wasn't the only one horrified by those Google search results. I know the temptation is to try that search for yourself, but thankfully, since she went public with her findings, Google has changed the results.
We think of the internet as the great equalizer, but the algorithms behind it, created by humans, are serving up content that is sexist and racist. For Noble, that discovery was the beginning of an investigation that eventually became the thesis of Algorithms of Oppression: How Search Engines Reinforce Racism. Dr. Safiya Noble joined me from a studio at the University of Southern California's Annenberg School of Communication, where she's an assistant professor.

First, I did a very systematic study on a whole host of different identities, and I used the US Census categories as a way to have a starting point for thinking about various racial and ethnic categories, and then paired those with the words boys, girls, men, and women. And so there were, you know, over eighty different combinations that I looked
at. And people often ask me, of course, what happens when you search for white girls, or white boys, or white men, and I find that to be one of the more interesting phenomena, because usually you'll get something like White Chicks, the movie by the Wayans brothers, which was a very popular result for "white girls" for a long time. But you might get other things, like objects: "white girls' dresses," meaning girls' dresses that are the color white. So "white" becomes that kind of descriptor; it's not an ethnic or cultural descriptor, it's more like a color associated with various kinds of objects. I think that's interesting, and part of it is because, as we know, in the United States most white Americans don't think of themselves as white; they just think of themselves as Americans, or as their own kind of specific identity. So that's, you know, something where I try to think through the complexities of that.
But I was also interested in certain kinds of concepts. So, for example, if you did a search, for a long time, on the word "beautiful," it landed almost exclusively on images of white women in Google image search. And again, you don't have to add the word "women" to the word "beautiful," but conceptually, Google Images framed beautiful as white women who fit the contemporary beauty standard. So in the book I looked at specific identities, but then I also looked at other kinds of concepts. And one of the things I've always found interesting is that for many years I talked about this specific example of "beautiful," and I'd say that something like nature ought to be represented as a concept of beauty, at least first, in Google image search. And over the years I've noticed that Google has actually changed its algorithm, and now when you do a search on "beautiful" you often get different results. So I find it an interesting, kind of quiet, dynamic relationship between Google and its critics; the criticisms levied not just by me, of course, but by people like Siva Vaidhyanathan, who wrote a great book, The Googlization of Everything (And Why We Should Worry), and Jessie Daniels, who wrote a great book called Cyber Racism. They've talked about this phenomenon for many years: for example, doing a search on the word "jew" and being led to Holocaust denial sites or anti-Semitic sites. And I really trace even the kind of contemporary moments where that happens on a variety of different searches, keywords that get co-opted either by white supremacists or by large industries like the porn industry.

Well, I mean, the basic question, and I think you framed it perfectly, which is to say: is it willful neglect, or is it a profit imperative that is making
money off of racism and sexism? And who stands to benefit from having these images come up, when the person searching is theoretically just searching in what they think is, as you said, a new public library?

Right. I mean, one of the things I always try to stress to people is that Google search is not a public information portal; it's not a library or a public library. It's really an advertising platform, in which you can optimize a lot of content, and Google's main interest is in optimizing content for its advertisers, for its clients, for the people who pay it to make their content visible. That's the entire premise of Google search. And so the fact that we don't understand how profitable racism and sexism are is one of the reasons why I try
to historicize some of these things, and establish that before we had the internet, racial stereotypes were really profitable in other places, like Hollywood. Sexism has been really profitable; objectifying women is something that's been with us for a long time, and there are industries that made a lot of money off of that. And now those same practices are happening in an environment like the internet, and in Google search. And not just Google; I mean, I could really take on any big commercial search engine, but Google is the monopoly leader, and everyone else is trying to be like it. It sets the tone for everyone else in the ecosystem.

I'm Lauren Schiller. My guest is Dr. Safiya Noble, an assistant professor at the Annenberg School of Communication at the University of Southern California. Her book is Algorithms of Oppression. Subscribe to the Inflection Point podcast for more stories of how women rise. When we come back, we'll look
under the hood at how Google works, and at how our search results can be manipulated to manipulate us. I'm Lauren Schiller, and this is Inflection Point.

My guest is Dr. Safiya Noble. She is an assistant professor at the Annenberg School of Communication at the University of Southern California, and her book is Algorithms of Oppression. So for those of us without a marketing background, who don't really understand how the whole system works when we put in a request and results come back: can you explain a little bit about how those search results get pushed to the top, and rankings, and optimization, and all these terms that marketers fling about that the everyday person may have no clue about?

So the primary mechanism by which Google makes money is making a lot of content visible for its clients. Now, a lot of people think that you and I, everyday people, are Google's customers, but we are not, because we don't really pay Google, and that's how you become a customer. Its customers are large companies and big industries, and even
everyday people who might use its advertising mechanism, which is called AdWords. You engage, twenty-four hours a day, seven days a week, in a real-time auction to pay for certain words to be associated with your content. In essence, what you're saying is: I'll outbid the next person to make sure that when these keywords are used, they're connected to my content. So if you're a large company, for example, let's say you're a big automotive company, you might have thousands of small sites from all kinds of marketing campaigns that you've done, all kinds of events over time, as well as commercials, all of this branded activity, and you are, in essence, linking all of your sites back to a key domain. You have a big footprint, so to speak, and you're optimizing a lot of content, and this is why you often find very large companies and big industries hitting the first page of search
results when you're looking for something. That's also one of the reasons why, even in the media industry, you often find large corporate multinational or national media organizations showing up on the first page: there's a lot of content, and they are also trying to make sure their content is visible. Then people are looking at that content, and when they click on it, that sends a signal back into the ecosystem that this content might be legitimate or viable; it's popular. All of these things come into play together. And I guess I'd add a third element, which is the people who have a lot of technical skill, who really know how to maximize the things we call metadata: tagging, embedding keywords deeply into the architecture of a site. That helps those sites become more visible too. It's this confluence of the relationship between money
and popularity and technical skill that I try to make more legible for people, so that they understand there are multiple processes happening; it's not just a matter of "what's popular is what we see," or "what's most credible is what we see." Google says that it optimizes for over two hundred different features or qualities that it's concerned with, and I guess the question for me is how certain types of decisions and values come into play. We don't allow child sexual exploitation to surface, for example; so why can't we also make sure that racist or discriminatory disinformation doesn't surface, the way we make sure that, say, animal mutilation doesn't come through? Those are the kinds of questions that I think help us engage with the complexity of these environments and, more importantly, with what values are at play.
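The confluence Noble describes, money (ad auctions), popularity (click signals), and technical skill (metadata optimization), can be sketched as a toy scoring function. To be clear, this is an invented illustration, not Google's actual algorithm; every field name, weight, and number below is hypothetical.

```python
# Toy illustration of the three ranking forces described above:
# advertiser money, click popularity, and metadata "technical skill".
# All weights and fields are invented; real search ranking uses
# hundreds of non-public signals.

def rank_results(query, pages):
    """Order pages by a weighted blend of bid, clicks, and metadata match."""
    def score(page):
        # technical skill: how many query words appear in the page metadata
        relevance = sum(word in page["metadata"] for word in query.split())
        return (3.0 * relevance          # metadata / keyword optimization
                + 0.01 * page["clicks"]  # popularity: prior click signal
                + 0.5 * page["bid"])     # money: keyword auction bid

    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "big-brand.example",  "bid": 8.0, "clicks": 90_000,
     "metadata": "cars trucks deals"},
    {"url": "local-blog.example", "bid": 0.0, "clicks": 1_200,
     "metadata": "cars repair history"},
]

ranked = rank_results("cars deals", pages)
# The well-funded, heavily clicked site lands on top even though both
# pages match the query word "cars".
```

The point of the sketch is Noble's: visibility is a blend of money, popularity, and optimization skill, not a neutral measure of quality.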
You know, what is the argument for that? We live in a society that allows free speech, and to do anything different would be censorship, because sexism and racism and all the other biases are not explicitly illegal the way child sexual exploitation is. What is the argument for not leaving those things out of the search results?

Well, this opens up a whole other area of conversation, but I'm super interested that you asked this question. I mean, the first thing is that all of the large tech platforms on the internet are deeply invested in this idea that they are not publishers: that they're not responsible for the content that flows through their platforms, that they are simply a kind of technical architecture through which people are speaking, and that they are not responsible for the things people say or do on the platforms. And this is really important; it's an important distinction for
these companies, because, one, they're very invested in a kind of libertarian notion that anything goes on the internet and there is no censorship, and, two, if they become responsible for the kind of content that flows through their platforms, well, now they're in trouble, because they really don't have the capacity to moderate all of the content that comes through. And I think here about people like my colleague, Professor Sarah Roberts at UCLA, who has done groundbreaking research on the people who look at the most horrific content and screen it out before it makes its way to YouTube, or Facebook, or into search. There are massive armies of people, precarious, low-paid workers for the most part, dispersed globally, who have to look at some of the worst of this content, and they screen it out. And, you know, you can use the
word "censor," or you could say "screen," or you could say "moderate," but what's important here is that decision-making is happening all the time on these platforms, and content is coming out all the time on these platforms, so right out of the gate they are not free-speech zones. I think that's something that we need to understand. We also need to understand that these companies are invested in us thinking that they are free-speech zones, because otherwise they have to take greater responsibility for the kind of content they carry; for example, they'd have to be responsible when disinformation campaigns move through their platforms that maybe undermine democracy, or foment ethnic cleansing or genocide around the world. I mean, many of these companies are not in a position to deal with the volume of content that's moving through their platforms, and so they stay anchored to this idea that anything goes and they're not responsible; it's really the
public that's responsible.

As I was reading your book, and thinking about the way that search results come up, and the information that we get when we're looking to learn something, the analogy that came to mind is that it's sort of like the canary in the coal mine. It feels like it's just the beginning of what can go wrong if AI is allowed to be incorporated too much into our lives.

Well, I think, you know, the book was difficult to write in that regard. There are many of us who are studying these different platforms and their consequences: the consequences for vulnerable people, for democracy, for the future of education and information, which are really important
elements of a democracy, having an educated public. And I think there are other kinds of unforeseen consequences. I've come to believe that not every answer can be found in point zero three seconds, and not every answer that a search engine can provide is adequate. We have universities, we have schools; people go through a long process from kindergarten through twelfth grade, and if they're fortunate enough to go to college, they have those experiences, because becoming educated, learning, is a nuanced, iterative process. It's part of the human experience. And search engines giving us an answer in point zero two seconds really truncates our thinking about what it is to know, what it is to learn, what it is to make sense of our world. Those are some of the things that I notice, just more anecdotally, as a professor, with my own students, who will write their papers from a search
engine and not from the library, and who, I think, often confuse misinformation, or propaganda, with information; I'm not sure they can tell the difference at times. And that kind of critical-thinking ability, to discern credible information from disinformation, and information from knowledge, matters. It's fine for banal information: what's that actor's name I can't remember, okay, great. But, you know, when you get really comfortable with the banal, you will take the more sophisticated questions there too; people ask more sophisticated questions of search engines. And in the book I wrote a whole chapter about Dylann Roof, for example, who was trying to make sense of the Trayvon Martin and George Zimmerman case, and we know, ultimately, in his own words, that he went to a whole host of these kinds of white supremacist sites,
and whether or not, and we don't really know for certain, on some level that information helped take him to the place where, ultimately, he opened fire on nine African Americans at a black church in Charleston, South Carolina. So it is important to think about not just the banal, but the more sophisticated and nuanced questions, and the kinds of horrible conversations, quite frankly, that come back to answer those questions. And I think over time we're going to see that the stakes are high for relying on these kinds of shortcuts, and for not taking the time to know better, to think differently, to appreciate science and philosophy and the arts. I mean, you just can't lean on the crutch of a search engine for coming to understand our world, and that's really the bigger project that I'm trying to communicate in this book.
I'm Lauren Schiller, and this is Inflection Point. I'm talking with Dr. Safiya Noble, the author of Algorithms of Oppression. Toxic ideologies are polluting our information sources at a rapid pace, and it's not confined to places like Google. Other algorithms, the ones that power some of the most significant decision-making mechanisms in our society, are, in essence, learning to bake bias into the system instead of eradicating it.

I think it's worth talking about other ways that algorithms are impacting our lives, and this concept that your searches are just the tip of the spear, in some ways. So, for example, here in California, a no-money-bail law was just passed, and instead what they're going to do is have more reliance on risk assessment,
using an algorithm to determine whether a person who has been arrested should be released or held in the local jail while they await their court date. And they've run tests on these things that show they're totally biased: if you're white and you have money, you get out, and if you're poor and a person of color, you're much more likely to be detained, because whoever designed the algorithm was biased.

Yes. Well, one of the things that's happening, in terms of all of these sentencing and risk-assessment types of software being used in the judicial system, the criminal justice system, is that, for the most part, the datasets being used are based on historical data collection, and they become the training data, so to speak, for these risk-assessment, automated decision-making systems. And it's really important, when we think about artificial intelligence, because that sounds like a really fun, sexy idea, but it's just an automated decision-making system: it's programmed, and it's trained on datasets, and many times those datasets are themselves deeply flawed. So it might not just be the bias of the jurors or the programmers, but a kind of lack of understanding of how those datasets were formed, and of the historical and social politics and relationships that are embedded in them. I mean, what's interesting to me about these risk-assessment projects, as others have also pointed out, is that you don't see these risk assessments pointed, for example, at Wall Street: who is most likely to defraud the country by betting against mortgages and cause a recession? Which kid at Dartmouth is going to go to Wall Street and defraud the country? Right? Like, no shade to Dartmouth; I'm just saying, we have to look at who these tools are pointed toward, who is considered suspect, and who gets to inform them. And this is a part where I think the relationship matters between the scholars who are studying these systems, the research that I and many others are trying to do, and a kind of critical tech literacy, an algorithmic literacy. Because we may be calling for one set of solutions to intervene on behalf of vulnerable people, only to replace old systems with big new systems that are even harder to intervene upon. And one of the things we're very worried about is the opacity of these algorithms, and of the software, and how difficult it will be for anyone with low levels of technical and scientific literacy to audit, or even understand, their bias and their disproportionate, unfair impact on our society.
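Noble's point that a risk tool is "just an automated decision-making system" trained on flawed historical data can be made concrete with a minimal sketch. All numbers here are invented for illustration; this is not any real pretrial tool.

```python
# Minimal sketch (invented numbers) of how historical bias becomes
# "training data" bias. True rearrest behavior is identical in both
# neighborhoods (20%), but neighborhood B was patrolled twice as
# heavily, so it contributes twice as many records to the history.
from collections import Counter

history = ([("A", True)] * 10 + [("A", False)] * 40 +
           [("B", True)] * 20 + [("B", False)] * 80)

def train_risk_scores(records):
    """'Train' by measuring each group's share of historical records,
    a stand-in for what an automated tool distills from arrest data."""
    counts = Counter(group for group, _ in records)
    total = len(records)
    return {group: n / total for group, n in counts.items()}

scores = train_risk_scores(history)
# scores["B"] is double scores["A"] purely because of patrol intensity,
# not because people in B actually reoffend more (both rates are 20%).
```

The skew is baked in before any model sophistication enters the picture, which is why the provenance of the dataset matters more than the cleverness of the algorithm.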
So I'm trying to think about how we, everyday people, could harness everything you're discovering, and the public awareness of what is actually going on. Because we can't beat the machine, right? We can be the savviest searchers in the world, but we're still being served up information that is going to, without our even realizing it, change the way we think about certain things. What can individuals do with where we are right now?

Well, I think one of the first things that we can and should expect is that our information environments not be deeply polluted, any more than we expect the water and the air to be polluted. I mean, we've seen, for example, in places like Flint, Michigan, where the lack of regard and concern
for vulnerable people has led to the wholesale poisoning of tens of thousands of young people, mostly African American, and for sure poor. So I think, along those lines: we have water regulations and air-quality regulations, we have regulation over drugs and pharmaceuticals, over the automotive industry, over all kinds of technologies and systems and services and natural resources that we're deeply reliant upon, because we need them to have a high level of integrity. And I think it's realistic for us to expect that if more of our public services, and educational services, and social services, and consumer experiences are moving to the web, that is an environment where we should also expect a high level of integrity. And this is where many of us are saying that we
need a better regulatory framework. We certainly wouldn't let pharmaceutical companies develop drugs and just turn them loose on the public and say, let's just see what happens; oh, those people died, we should take that off the market. We don't want to learn about harm after people's lives have been taken or devastated. And yet we have a whole ethos in the tech sector where we're really developing projects and ideas, whether in our labs, or in garages, or in all kinds of conference rooms, and getting venture capital backing, and turning them out into the marketplace, and then we figure out after the fact: oh, that didn't work out so well, maybe we should have rethought that. We need greater protections between some of these new so-called great ideas and their being turned out to the public, because one of the things we know is that poor communities, communities of color,
and the children of the most vulnerable people in our society, are really treated as disposable; they'll be the people upon which these technologies are practiced and perfected, like the elimination of cash bail, or predictive policing, or other kinds of services. If you really want an education in how some of this is devastatingly happening, you should read Virginia Eubanks's work on automating inequality; it will really help your listeners, I think, to understand how foster-care systems, and automated social services, and the distribution of those services to poor people are getting caught up in these kinds of software-based systems, technical systems, and people's lives are really hanging in the balance.

Okay, and then there's just this little thing called the election of 2016. Before we even go there: you have in your book a study from twenty
thirteen about manipulating search rankings, and how that can shift voter preferences without the voters even being aware.

Yes, this is a really important study by Epstein and Robertson. In a controlled study, they had voters do searches, and if on the first page they were exposed to content that was negative about a candidate, they said they would not vote for them, and if it was positive about a candidate, they said they would vote for them. And they argued, ultimately, from the findings of this study, that democracy was really at risk, because search-engine rankings are so easily manipulable, and most people don't even know when they are being manipulated. I thought this was a really important study, and I talk about it a lot, and I try to share these scholars' work with others, because the
truth is, there was an incredible amount of media manipulation happening during the 2016 presidential election, and we know this. I think a lot of the focus has been on Facebook, and to some degree Twitter, and the ways in which misinformation or disinformation around Hillary Clinton, for example, circulated. But, you know, people have paid a lot less attention, I think, to the ways that Google and its properties, like YouTube, were also being used. And one of the things that's particularly interesting to me is how people often use Google to fact-check. So let's say they see something that doesn't make a lot of sense: hold on, let me look into this further; like, there's a story about a conspiracy running out of a pizza shop, right? Let's go check that out. Well, where are they most likely to go? They go to Google to fact-check it, thinking that Google will be the place to make sense of the things they find on social media. So I think that's a really important thing we need to keep in mind, again, about how the public relates to what it finds. We know, for example, from the Pew internet research studies, that more than seventy percent of the public believes that what they find in search engines is accurate and trustworthy, that it's credible and reliable. And immediately following the 2016 election, as was widely covered, there was the story that if you did a search on the final US presidential election results, you were led, on the very first page, to a disinformation site that said that Donald Trump had won the popular vote. Now, you know, this isn't just an alternative fact; this is a flat-out lie. And yet here we have half the country who, to this day, still believes that Donald Trump won the popular vote. So I think about these kinds of things when we talk about the veracity of information, how we can trust information, and how information plays a role in how we perceive politics. If you're just trying to make sense of the political environment, it's very, very difficult, and people are really reliant upon search engines and, by extension, social media.

I'm Lauren Schiller, and this is Inflection Point. I'm talking with Dr. Safiya Noble, an assistant professor at the Annenberg School of Communication at the University of Southern California, and author of the book Algorithms of Oppression. For more stories of how women rise up, subscribe to the Inflection Point podcast. My guest and I will be right back.
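The Epstein and Robertson finding discussed above can be illustrated with a toy model. The one-over-rank attention weighting and the sentiment values below are invented for illustration; they are not the study's actual methodology.

```python
# Toy model of ranking bias: readers weight top results far more than
# lower ones (here, attention ~ 1/rank, an invented assumption), so
# reordering the SAME ten stories flips the net impression a searcher
# forms of a candidate.

def impression(results):
    """Position-weighted sum of sentiment (+1 pro-candidate, -1 anti)."""
    return sum(sentiment / rank
               for rank, sentiment in enumerate(results, start=1))

pro_first = impression([+1] * 5 + [-1] * 5)   # positive coverage on top
anti_first = impression([-1] * 5 + [+1] * 5)  # same stories, flipped order

# pro_first is positive and anti_first is negative: the ordering alone,
# with identical content, reverses the overall impression.
```

This is the crux of the manipulation risk: nothing false need be published; choosing which truths sit in the top slots is enough to tilt what searchers take away.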
I'm Lauren Schiller. My guest is Dr. Safiya Noble. She's an assistant professor at the Annenberg School of Communication at the University of Southern California, and her book is Algorithms of Oppression. So there's this sociologist that you talk about in the book, Herbert Schiller. He was born in 1919 and died about twenty or so years ago, so he was around from the early part of the twentieth century, and way back then he coined this term, "packaged consciousness."

Yes. So Herbert Schiller is really important, and his ideas animate my own investigations. His life's work was really concerned with monopolistic corporate control over media environments, and with how less media would have a huge impact on democracy. If you look at his work, he was deeply concerned about the ways a small number of companies could come to control the messages, so to speak. And this is of course really important, because when you study other media scholars, and the history, you know, for example, that more media and more voices are better for democracy, and that less media, fewer media companies, and I would now include big tech companies in that, is worse for democracy, because then a narrow band of messages and ideas can permeate into the consciousness of the people. And certainly, the abolition of the enslavement of African people, for example, was bolstered by the fact that at one time we had over two hundred thousand newsletters, pamphlets, and newspapers in circulation, and abolitionists were incredibly reliant upon being able to circulate a lot of stories and ideas about how perilous and immoral the enslavement of other human beings is, and it was. And so, you know, Herbert Schiller to me is really important, because he was an important voice speaking out against monopolistic media control, and I reference him because I certainly think of Google as having a de facto monopoly on search, and as being a very important primary, first-step gateway into the information ecosystem online.

Will you, well, I was going to ask you to
describe this visualization that you offer up as an alternative way of viewing search results?

Yes, and it was more a gesture, you know, because people often say to me, everything you're talking about is a downer; oppression is a downer! Yeah, I mean, you know, it's like we're laughing as a defense mechanism, okay. So in the book: my parents were artists, and the metaphors that I reach for in my own life are often from the art world. And so one of the things I did, and it's really not fair to call this a solution, because I don't really think it's a solution, it's a kind of approach to the complexities of the things we're talking about; there isn't just one answer, where if we did this one thing, or built that different technical system, it would be fine. But I do think that if we had a really
diverse media landscape, so to speak, and we had multiple ways of thinking about finding information on the internet, that would be better than having one ranked list. So I used the metaphor of the color-picker tool, which many of us know; it's like a color wheel where all the colors blend into each other, and we're trained as users, when we're looking for a pretty color for a font or an object, to move around in it. I use this metaphor to say: what if our interface to information on the internet were something that looked more like a color-picker tool? Where, if I put my search box in the red portion of the color wheel, I'd know that's the red-light district, so to speak, of the web, and I'd know what's coming to me there; and maybe if I dragged it to blue, it's government information, or academic studies, or research; and if I dragged the search box to the green part, I'd know that
there would be products, or things for sale - a kind of more commercial environment. And maybe, if we wanted, we could make sense of things that are overlapping and hold multiple interests at once. What I'm trying to do with the metaphor in the book is describe what it might mean if we had something other than a ranking system, from one to a million or more, as a way to make sense of the information we find online. And part of that is because we're in a cultural context, certainly in the US and in the West, where to be number one is to be the best. So if you go to a page and you're experiencing information in a rank-order fashion, there's already a cultural context for number one being the best. You know, we don't hold up a foam finger at a football game that says 'we're one million three hundred' - we hold up a number one. So the idea that there is a number one answer is something we might want to disrupt, right out of
the gate. The other thing I say that's really important in this is that we need a non-commercial way of accessing information, and this is where I am so grateful to librarians in particular, who have been curating and indexing and organizing knowledge for thousands and thousands of years for us - whether it's oral forms of knowledge or print knowledge - and they have done the work of kind of helping organize it, and not necessarily for a profit motive in the same way that an advertising platform like a commercial search engine is doing. And so I've been trying to raise money - and if there are any foundations and funders interested in supporting librarians, academic and public librarians, in tackling the indexing of the open web, I think that would be really viable and valuable to our democracy. We just need many, many
ways of knowing, rather than having a handful of platforms that kind of serve as the gatekeepers. And to me - it's not that we need Google to go away; we just need maybe a thousand more differently motivated search engines. Yeah - that is, I mean, that's brilliant, and hopefully the antidote to what I hear all the time, which is - when I talk with somebody who's in tech, or involved with it in some other way, they say, 'Look how far along Google is; I'm just going to wrap my systems inside their systems.' And when I hear them say that, and then I read your book and I talk to you, I'm thinking, hold on a minute. You know, I was just talking with my students this morning in class about Microsoft, and remembering that, you know, people would turn in their
assignments on Apple computers, in Pages, and then the teachers couldn't open them, and they're like, 'What is this? Can you send it to me in Microsoft Word? I can't read it. Or, you know, a PDF?' There is something about the way that so much capital and investment is sunk by large institutions into only one kind of system - whether it's software, or new interfaces, or hardware - that's very difficult to break. I've heard of professors who want to work on a Mac and are refused, because they know the institution is only going to support a PC, and these kinds of decisions are made higher up - or at least you have a shrinking of choices, and they happen at the level of large institutional models, quite frankly. So I
don't think that these choices can really always be made by individuals. I think it's really about how public institutions and commercial institutions decide to kind of open up new possibilities, and of course we need many, many more noncommercial, public-interest types of technologies. I want to ask you about - you know, this is a show about how women rise up, and we haven't really focused on how this applies to that, but you do write in your book about taking a black feminist technology studies approach, and I wanted to ask you to talk more about what that means. Well, you know, in the book I say that we need a black feminist technology studies approach to thinking about, you know, not just Google but a whole host of other forms of technologies. And what I mean by that is that black feminists and activists,
scholars, community organizers, mothers and aunties have really, for about a century, centered those who are the most vulnerable. Now, in many cases those are also other African Americans, or black people, but there's this idea of thinking about how we center the people who are the most vulnerable in our society - because, you know, when they rise, everyone rises: when you eradicate poverty, when you eradicate discrimination, and the systems that hold these kinds of second- and third-class types of citizenship or belonging in place. And we see black women, who have borne the brunt of both racist and sexist systems simultaneously, and
who have also been on the forefront of organizing - for civil rights, for human rights, for women's rights. And so basically I'm saying that it's from those traditions that truly innovative and important and just and humane and fair ideas have come into existence in our society, and I do believe - I think history bears that out. So what would it mean if we centered these ideas about the most vulnerable, the most precarious, and we designed our technologies with those things in mind? What it would mean, for example, is that when we're making hardware - like all the electronics that we use, and especially those that are reliant upon any kind of microprocessor - we would be mindful, and we would not engage in exploitative, extractive mining relationships.
So we would design microprocessor chips that didn't foment, you know, the abuse of the people who do that kind of labor and who are exploited. And we also would design technologies that, for example, when we're done with them, don't get loaded up on a barge and shipped to China, or shipped to West Africa, to pollute their environment - dumped into these huge toxic e-waste cities. We would design differently if those things were in our front yard here. If you dumped all of that in Palo Alto, trust me, there'd be a totally different set of design imperatives - maybe the biodegradable phone, maybe you bury your phone when you're done with it, I don't know. A different set of values would help us rethink and reimagine, and quite frankly we
really must do this, because the tech industry, with all of its ties to mining, to resource extraction, and all of this - you know, what's supposedly 'the cloud' is actually huge server farms - the environmental degradation that is happening from the ways that we live as human beings now, it's indescribable. It's another - it's another day for, you know, talking about terrible things: global warming and all of these kinds of environmental disasters; the tech industries are implicated in that. And so this is why I say, what if we designed with a different set of values? And I think that black feminists have given us some amazing values, and we have seen a lot of liberation from human suffering and from the denial of rights and participation in society - that is the legacy that can really help us reimagine this world. So I think that's a great place to go for new thinking.
I mean, what if we did see a black feminist Silicon Valley? And, you know, I'm here for that, and I think they would have a lot of fresh new ideas and a lot of really different kinds of ideas. We're living in a moment where the keys to our portals of knowledge are in the hands of the rich and powerful - people who have the least to gain and the most to lose from changing the system we live in today. Artfully designed systems profit from clicks; when discreditable information grabs human attention and that misinformation is spread, you have the seeds of division. Dr. Safiya Noble has a vision for righting this profit-driven order of things. She sees a world where we design systems around the most vulnerable, so we all rise, instead of serving only the most powerful. Dr. Safiya
Noble is the author of "Algorithms of Oppression: How Search Engines Reinforce Racism." The sooner we understand how to find and share reliable sources of information, the sooner we can end the algorithms of oppression - and this is how we rise up. This is Inflection Point. I'm Lauren Schiller. That's our Inflection Point for today. All of our episodes are available on [podcast platforms unintelligible]; leave us a five-star review and subscribe. Know a woman with a great rising-up story? Let us know at inflectionpointradio.org. Follow me on Twitter [handle unintelligible] to find out more about the guests you hear today, and sign up for our newsletter at inflectionpointradio.org. Our story editor and content manager is [unintelligible]. Our engineer is [unintelligible].
Series
Inflection Point with Lauren Schiller
Episode Number
#103
Episode
How Search Engines Reinforce Racism & Sexism - Dr. Safiya Umoja Noble
Producing Organization
Inflection Point with Lauren Schiller
Contributing Organization
Inflection Point with Lauren Schiller (San Francisco, California)
AAPB ID
cpb-aacip-fac43231ccb
Description
Episode Description
What we think of as “the great equalizer”--the internet--and the algorithms behind it, created by humans, are serving up content that is sexist, racist, and biased. That discovery was the beginning of an investigation that eventually became a book, “Algorithms of Oppression: How Search Engines Reinforce Racism.” In this episode of Inflection Point, Lauren Schiller talks with the author, Dr. Safiya Noble, assistant professor at the Annenberg School for Communication at USC.
Broadcast Date
2018-11-05
Asset type
Episode
Genres
Talk Show
Topics
Technology
Women
Subjects
AI; Machine Learning
Media type
Sound
Duration
00:54:24:01
Credits
Guest: Noble, Safiya
Host: Schiller, Lauren
Producing Organization: Inflection Point with Lauren Schiller
AAPB Contributor Holdings
Inflection Point with Lauren Schiller
Identifier: cpb-aacip-1c7795a4d58 (Filename)
Format: Hard Drive
Citations
Chicago: “Inflection Point with Lauren Schiller; #103; How Search Engines Reinforce Racism & Sexism - Dr. Safiya Umoja Noble,” 2018-11-05, Inflection Point with Lauren Schiller, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC, accessed September 20, 2024, http://americanarchive.org/catalog/cpb-aacip-fac43231ccb.
MLA: “Inflection Point with Lauren Schiller; #103; How Search Engines Reinforce Racism & Sexism - Dr. Safiya Umoja Noble.” 2018-11-05. Inflection Point with Lauren Schiller, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Web. September 20, 2024. <http://americanarchive.org/catalog/cpb-aacip-fac43231ccb>.
APA: Inflection Point with Lauren Schiller; #103; How Search Engines Reinforce Racism & Sexism - Dr. Safiya Umoja Noble. Boston, MA: Inflection Point with Lauren Schiller, American Archive of Public Broadcasting (GBH and the Library of Congress), Boston, MA and Washington, DC. Retrieved from http://americanarchive.org/catalog/cpb-aacip-fac43231ccb