Big Tech Is Playing Us
How do tech giants like Amazon, Google, and Apple shape and exploit racial dynamics for profit? Our guest, Julia Rhodes Davis, joins hosts Solana Rice and Jeremie Greer to uncover the hidden ways tech giants influence our society and exploit black and brown communities.
Episode 20: Big Tech Is Playing Us
References and resources
People really living in very different worlds. There is a reason why the majority of people who are stocking bunkers and buying up land in resource rich regions are white, male tech CEOs and shareholders.
Solana Rice:
Hello. Welcome to Racism Is Profitable, a podcast about race and the economy. I’m Solana Rice, co-founder and co-executive director of Liberation in a Generation Action. And I’m with my co-founder, co-executive director, Jeremie Greer.
Jeremie Greer:
What’s up?
Solana Rice:
Well, Jeremie, it is the holiday season and this time of year at least domestically, it’s gift giving season. It’s a national pastime. Did you get the gift? Did you buy the gift? Did you like the gift? Where are you at on this?
Jeremie Greer:
I hate that. I hate the mall. I hate the mall. I hate going to the shopping mall. You know me. I’m naturally an introvert. I don’t love to be around a lot of people. So when I do go to the mall, it’s like at 2:00 on a Wednesday in the middle of March or February when no one’s there. A Saturday at the mall this time of year, I want nothing to do with that.
Solana Rice:
Yeah. So just stay away. So does that mean you’re just doing all your shopping online?
Jeremie Greer:
Which is still not easy because through our work and our many partners, I’ve come to learn … And through the work that we do together, this podcast, how racial capitalism leaks its way into everything we do, so you have to be discerning. Amazon. Okay. I use Amazon. Okay. All right. Don’t want to go … Oh, I already knew about Walmart. They make it so easy. So easy. Because an introvert like me doesn’t want to go to the mall.
Solana Rice:
You don’t have to interact with anybody.
Jeremie Greer:
I don’t know what to do. I’ll get it done. I will venture to the shopping mall eventually, but this is a part of the season I just do not like.
Solana Rice:
Oh my goodness. Well, it’s-
Jeremie Greer:
Which was made … Go ahead.
Solana Rice:
It’s getting late in the season. You better get on that.
Jeremie Greer:
This is every year. I’m practiced. I’m well practiced in this last minute rush for gifts. But I read this article in The Guardian that really further solidified my feelings about Amazon. The Guardian did an article recently where … It was December 18th. And they highlighted a little campaign at Amazon in which Amazon sent a letter to all of its employees and asked them to tell Peccy. They asked them to tell Peccy, who is their … Well, they call it their mascot and cultural ambassador. I didn’t know Amazon had a mascot until I read this article, and then I looked it up and I saw it. It’s an orange blob. This weird looking … I can’t even tell you what it is. It’s like a little-
Solana Rice:
It’s kind of like a minion.
Jeremie Greer:
Yeah, like a minion. Yeah. Or like the orange ghost in Pac-Man. That type of thing. I guess it’s cute. I didn’t know it existed until I saw this. But they said, ask Peccy. Tell Peccy your hardships. And this is going out to their workers. Tell Peccy about your hardships and maybe Peccy will do something special for you during the holiday. So basically ask Peccy, this little orange minion, for help. Amazon’s telling their employees that if they’re having a rough season, to ask this little orange character for help.
Solana Rice:
Meanwhile, the workers have said what they need. The workers said, pay us more money. We do really hard work, so just pay us what we deserve.
Jeremie Greer:
Just pay us. Just pay us. There’s a quote from Keith Williams, who’s a worker at the warehouse this was distributed in, and he said, “They basically say, ‘Tell us about your hardships, how hard this season is for you, and if we feel sad enough, we’ll give your family some help.'” That’s basically what they said. And again, the employees have said what they want. They want more pay. They need to make more than $17 an hour, which is the entry level wage at Amazon. Reports have said that in the places where Amazon workers live, pay should be $28 an hour to match the cost of living. And at the same time, this is a great season for Amazon. They’ve tripled their profits in the most recent quarter to $9.9 billion. And rather than just pay their employees, they want them to bring their hardships to this orange blob.
Solana Rice:
This orange blob. It’s like an orange Santa Claus maybe. I got really in this idea of-
Jeremie Greer:
Is that Trump? Is that what Trump’s supposed to be? The orange Santa Claus? What?
Solana Rice:
I got really interested in this idea of the mascot so when you sent me that article, I did a little digging and I was like, what is this Peccy?
Jeremie Greer:
Oh, you did some research on Peccy or whatever the fuck it’s called. Peccy.
Solana Rice:
Yeah. I don’t know. I guess we’re saying it right. It’s P-E-C-C-Y. And the folks over at Amazon say … The representatives say that they love Peccy. He’s called Peccy because he represents our peculiar ways. We call ourselves at Amazon very peculiar. Another Amazon PR representative said that … I guess around the world, Peccy looks different. Sometimes they’ll have glasses, sometimes they’ll have [inaudible 00:07:09]. Sometimes a hat.
Jeremie Greer:
Like Mickey? Kind of like Mickey? Just dress up for the country they’re in? Okay.
Solana Rice:
Yeah. You just never know. And the PR representative says he’s really a master of disguise. Which I think is so apropos for Amazon, because if we’ll recall, it was a book company and now we can’t do anything without running into Amazon. We can’t do a query, we can’t go on the internet. We can’t look at TikTok. The Guardian had another article about how TikTok apparently also scrubs its data and all its content for things that might harm us. So they have thousands of people all over the world who look at content all day, all night. And this was the unofficial policy. TikTok said, “Well, if you come across something with Amazon on it, don’t necessarily block it, because obviously they’re one of our biggest advertisers and so we wouldn’t want to just scrub off content from Amazon.” And so of course TikTok is like, “This is not an official policy. This is not really what we’re about. We’re about protecting the folks that view our content.” Of course, The Guardian says otherwise. And I’m pretty sure our experience of Amazon and other big tech firms tells us otherwise, and we really have to wonder, are all these companies actually working together to give us very little information, or to make sure the information we do have serves their profit?
Jeremie Greer:
One might call that collusion, which is an illegal act in accordance with US antitrust laws. Just saying. But it turns out Peccy is also a TikTok moderator, and it approves what goes on through TikTok apparently.
Solana Rice:
His little googly eyes being like, nope, that’s just fine. Peccy says okay.
Jeremie Greer:
He’s like porn, violence, Amazon. Okay.
Solana Rice:
Okay. Okay. Okay.
Jeremie Greer:
Yeah. No. And then in all seriousness, it is totally aligned with Amazon’s strategy of controlling information, which we know is important to them, as Jeff Bezos owns the Washington Post, one of the largest print newspapers in the country. So in order to expand your monopoly, one of the strategies is to control the means by which we as people out in the world get our information. And TikTok is on the level of that type of platform, and it’s definitely why they … And of course, TikTok’s behaving this way because Amazon is a huge investor in TikTok, which is a Chinese owned company, which isn’t a problem in and of itself, but it is a large corporation that is a monopoly, and Amazon’s way of controlling it is to hold an ownership stake in that company.
Solana Rice:
So we explore not only Amazon, but a lot of other tech companies and the general tech business model with Julia Rhodes Davis, who is our guest on the pod today. Julia Rhodes Davis is a leader and expert at the intersection of technology, democracy, and social justice. She’s the lead author of Advancing Racial Equity Through Technology Policy, and we are so delighted to have her. Stay tuned.
Welcome to Racism Is Profitable, a podcast where we talk about the intersections of race and the economy. We are so pleased to have a great guest on today. You may recall, or you may not, that Liberation in a Generation released a report several months ago about the role of large corporations in perpetuating the oppression economy that we live in. And so today what we want to talk about are the dimensions of oppression that we see with corporations. We’ll dig specifically into big tech, though. We want to see how those dimensions come to life. And so today we’ll be talking to Julia Rhodes Davis, co-author of Advancing Racial Equity Through Technology Policy. And we really want to hear how tech giants like Amazon and Google and Apple and Microsoft make racism profitable. We’re going to get under the hood a little bit and dive into what to do about these corporations in particular and big tech generally. Welcome, Julia.
Julia Davis:
Thank you so much. Really delighted to be here.
Solana Rice:
Yeah. Let’s start really easy. What brought you to write this report?
Julia Davis:
Great question. I think that we’ve seen an increasing concern about the ways in which big tech is influencing our democracy and our economy. ChatGPT has become a tableside conversation in many households, and yet racial equity is really not a central part of the conversation. And I think that in order for us to ensure that we have the futures that we are longing for, certainly that you all are helping to shape at Liberation in a Generation, we really need to center the concerns of black and brown communities as it pertains to how technology policy shapes our lives.
I also just think that we are seeing momentum build on Capitol Hill around legislation and regulation of artificial intelligence systems. And so there’s also a timeliness to this conversation. And I want to make sure that the laws that are written, hopefully in Congress in the next year or so, actually really center racial equity concerns, because these technology systems are going to encode values, I think, even more deeply into our daily life. And so I want to make sure that this paper is part of a broad effort to center racial equity concerns. And then there’s also a more personal answer for me, which is that I personally seek to repair harms and also honor the legacy of justice seekers in my own lineage. And so this is an offering, a personal offering, to that as well.
Solana Rice:
That’s inspiring, I think, in part because technology makes our lives so easy, but it does come at a cost, and it comes at big costs for people of color, and often costs that we can’t necessarily get out of. And we might not even know that we’re paying them, right? And it was part of the reason why we started writing our paper as well. Our paper is more about corporations generally, but tech in particular is one of the reasons why we started getting into this work, because of the intersections with our justice system, with our criminal legal system. The surveillance that happens, the reporting that happens across agencies, we found really disturbing at the very least. And so I’m curious, how did you find folks to write about this with you?
Julia Davis:
Right on. Great question. I along with my co-authors started with a literature review even trying to understand where is racial equity or where are racial equity concerns showing up in tech policy, tech and society debates and really identified some pretty significant gaps in terms of understanding a bigger picture. So yes, absolutely there’s a lot written and a lot of important work happening around the use of technology in the carceral system and over surveillance of communities of color by police. And then there are separately concerns about labor and workers. But really there’s not a comprehensive view of the big tech business model. And so we put out a call. My co-authors and I put out a call for white papers from academics and experts who are really working on these issues to help us create a framework for understanding how racial equity is implicated by big tech.
And so we got an overwhelming response from folks. And we ended up with a handful of original research papers from scholars like Nicol Turner Lee of the Brookings Institution, Ruha Benjamin and Sierra Robson from Princeton University, Shelly Stewart at the Aspen Institute, Fallon Wilson with the Black Tech Futures Research Institute, Veena Dubal at the University of California, Irvine, Ulises A. Mejias at SUNY Oswego, Amina Kirk and Mae Watson Grote at Change Machine, and Jasmine McNealy at the University of Florida, as well as Sarah Meyers West from the AI Now Institute.
And so they really helped us understand different dimensions of racial equity and how they’re implicated by big tech. They helped us inform a view of what the business model really is, and helped us understand that though each company, Google versus Meta versus Amazon and others, wants us to believe that they all operate distinctly from one another and don’t have shared business models, in fact they really do share some pretty key attributes that look very similar. Maybe in different measures, but they’re all operating with a set of practices that inform their business model and ultimately lead to their consolidation of power. So thanks to all of those scholars, and actually to our advisory committee, who are named in the publication and on which you sat, Solana, for making it happen.
Solana Rice:
Yeah. It was a pleasure. It was a real pleasure. What I really … And you’ve started to allude to this. That there is a shared business model. There are these five … You call them five interlocking ways that companies make money. That these tech firms in particular make money. One is owning a lot of cloud storage, software applications, machine learning. Today I had to do a whole lot on my Mac. I couldn’t do it at all without my iPhone. It was just like I had to be immersed in all Apple products all the time to get it done. Another is advertising. Can you share more about the other three ways that these companies really consolidate their power? So basically looking at how they can control the market and the marketplace, how they influence the regulations that should be regulating and policying them. That’s a new word, policying. It’s a term of art. And labor capture. Could you say more about those other three?
Julia Davis:
Yeah. Market dominance really operates in two ways. One is the ways in which big tech companies are buying up all the competition. So they’ll look at the field of startups who are … Maybe they’re trying to get into search, maybe they’re trying to get into e-marketplaces. Maybe they’re a really niche product, but it’s something that, say, Google has spent a lot of money on R&D around. These bigger players will buy up these startups and either take the IP if they see it as valuable, the technology advancements that they are developing in these startups, and import them into their own products and then kill off the startup entirely, or they’ll just buy them up so that they’re not a problem anymore. They’re not a potential competitor in the landscape. And they have so much capital, so much money, to make these acquisitions. I think in 2021, big tech spent over $260 billion essentially buying up and killing startups one way or another. So that’s one way it operates.
And then the other way it operates, big tech often participates in the markets that they control. So Amazon for example, they often have a talking point that they’re a wonderful platform for mom and pop companies and mom and pop stores to sell their wares on, but in fact, they’re mining which products are doing well on Amazon that are being sold by third parties. And oftentimes they’ll produce their own and then through their control of the marketplace, set pricing and visibility for that product above the mom and pop who was the inspiration for that product in the first place. So that’s how market dominance works.
Regulatory influence shows up in a couple of ways as well. Lobbying is nothing new. Lobbying and corporate communications. But big tech spends tens if not hundreds of millions of dollars, not only lobbying policymakers and lawmakers about how tech should or should not be regulated, but they’re also seeding a mainstream narrative that tech is so complicated that it can’t possibly be regulated. You could never possibly understand it well enough to regulate it. And so if you’re going to do anything at all, you need to ask us, big tech and our CEOs and our heads of research and so forth and so on, if you’re going to intervene at all because you couldn’t possibly understand it on its own.
And then also, cloud computing, as you mentioned, has really grown exponentially over the last few years. And one of the biggest customers of cloud computing products, as boring as it is, is federal, state and local government. And so when you become so dependent on technology providers for the day-to-day operations of your business as you were just describing, it’s very hard then to also be able to intervene with policy that’s going to create more friction for those operators. So there’s an inherent conflict of interest. And then I left labor for last because I think it’s just so critical. Of all the harms, and we can talk more about this in our conversation, this is ground zero of people’s lives and people’s livelihood. And so yes, these other harms and other ways big tech controls power is really important to understand to have a holistic picture.
I think in terms of just being responsive to the humanity of people’s lives and experiences, we really need to hold the ways in which labor capture is such a big part of the big tech business model. And obviously this has its roots in the earliest days of racialized capitalism in the world. Extracting value off of the backs of human beings is something that America pioneered and has been really good at from the very beginning. But what’s distinctive about big tech is something that we talk about in the paper, invisibilized labor. And so basically what that means is big tech is intentionally making workers and especially low wage workers less visible to management, consumers, policymakers and each other. And that really diminishes the ability for workers to intervene on their own behalf and for others and advocates and policymakers to intervene on their behalf as well.
And so by invisibilizing labor, tech companies are maximizing efficiency and lowering labor costs while the workers themselves are faced with highly constrained, precarious, and isolated labor conditions, which leads to stories that we’ve heard thanks to the organizing of folks like Christian Smalls in New York and others. We really understand that people are not able to take a bathroom break, or are working through injury because they don’t have a human being to go tell that they were injured, and if they stop their job, they lose their job. So these are the kinds of examples that I really think we need to understand are affecting folks, and affecting low wage workers, who are disproportionately black and brown, at a much greater rate.
Solana Rice:
Yeah. I’m curious if you could say a bit about what kind of jobs these are. Because sometimes I even have trouble imagining … Of course there’s the Amazon warehouse workers. But are there other entry level jobs that are impacted or invisibilized, as you said?
Julia Davis:
Yeah. I think that there are a few different categories here. One would be folks who are doing data labeling. And these are people who potentially are coding certain types of imagery that are then used in generative AI systems. And there’s no protection for these folks against being exposed over many hours at a time to really egregious or violent content. Same goes for those who are labeling data to inform automated content moderation. So oftentimes social platforms will say, “We are increasing our ability to respond to online harassment or to bad actors on our platforms by increasing our ability to automate our responses to that.” But that capacity to automate is being built on data sets that are being labeled by human beings. And when you’re talking about violent white nationalist content, for example, what does it do to your mental health to be exposed to that type of content for dozens of hours on a weekly basis? So that’s an example.
Another example are rideshare drivers who very often their take home pay from each ride changes ride to ride, day to day, conditions to conditions in ways that they don’t have any ability to anticipate. And I think in all of these cases, including Amazon warehouse workers, the majority of these workers are “contract workers”. So you talked about entry level, Solana, and when we think of the entry level jobs that my parents’ generation had available to them, those came with healthcare and employment protections, and these come with none of that. So there’s also just the baseline precarity that these workers are facing in addition to the really challenging environments that they’re working in that create incredible harm.
Solana Rice:
Yeah. It just strikes me that across all of these ways that racism is profitable inside of these tech companies, we have a range of ways. It’s devastating. The idea that not only it’s intellectual property as you talked about. Like our ideas are being consumed and used. Our identities are being consumed and used on the data side for for-profit. Our entrepreneurial spirit, our physical and mental wellbeing is at stake. I’m curious amongst all of this … It’s a lot. What are some of the most egregious implications, particularly for black and brown people, whether we be consumers or business owners or just everyday folk trying to live our lives?
Julia Davis:
Yeah. It’s really hard to say which one’s more egregious, because at the end of each example, there are human beings whose lives are deeply impacted. And I think three stand out. One is the way in which AI, or algorithmic decision systems, which is just a wonkier way of saying AI essentially, are used to make decisions about access to public assistance. So you’re talking about a group of folks who are already living in the precarity of poverty, and then they are facing lack of access to healthcare, lack of access to food, lack of access to housing, all because we know that algorithmic systems are discriminatory and inherently inaccurate, and oftentimes are deployed in these public assistance contexts in ways where people who are denied access are not able to understand why they were denied or to have timely recourse. So imagine you are there to collect your food stamps for the week, and you’re told in some automated way that you’re denied. There’s no human for you to go talk to that day. And so what do you do about feeding your family? So that’s one.
Two, I think, talking about labor capture as we just did, I just really want to continue to underscore that because of the underrepresentation of black and brown workers in the white collar jobs of tech and the overrepresentation of black and brown workers in the low wage jobs in tech, the ways in which the labor dynamics and the extractive nature of racialized capitalism play out in these contexts is just really profound. And then the last thing, which really doesn’t get nearly enough airtime, and in hindsight I wish we had put even more emphasis on it in the paper: there are tremendous environmental justice impacts of AI systems. To process the amount of data that is powering AI systems, Google and Meta and Amazon and Microsoft all own data centers all over the country. And the sites of these data centers are often in resource- and water-scarce geographies that often house poor folks, often disproportionately black and brown. And then generative AI really blew up after we published the paper, but just to give a really concrete example: one query, one back and forth between you and the ChatGPT bot, takes half a liter of water to produce.
So yeah, it’s staggering. The real world environmental impacts of these technology systems that are often represented as existing in the cloud as being virtual, as not having real world tangible IRL impacts, but in fact they do and they’re significant. And at a time of climate crisis, I really think that it’s critical that these environmental justice issues get brought to the front of legislation that can protect folks from losing access to critical resources.
Solana Rice:
Yeah. Thank you so much for remembering to offer that, because I certainly think of all of this technology as being completely virtual and up here in the cloud, and it’s fine, it doesn’t have any physical repercussions. And the idea that one query probably uses just as much water as an almond. The idea of producing an almond or producing an avocado that actually feeds us and provides us nutrition is really super important. And as you said, black and brown folks are usually the first hit in these climate crisis moments and have sustained environmental justice impacts. We just have to remember that everything we produce in this nation, in this world, is based off of something physical, is based off of our land and our natural resources. So thanks for bringing that back. And I’m just curious, what do you imagine if we do nothing? If we just keep doing what we’re doing and just let the big tech companies grow and do what they do?
Julia Davis:
I first want to say that there are a lot of people working really hard to make sure that we don’t cede the future to those interests. So let’s just name that that is happening and that-
Solana Rice:
Thank goodness. Thank y’all. Thank all of y’all out there.
Julia Davis:
And honestly, Octavia Butler’s Parable of the Sower comes to mind. Real talk. I really see dramatically wider economic and social inequality. I see an earth stripped of her natural resources. I see ever-increasing political polarization and violence and people really living in very different worlds. There is a reason why the majority of people who are stocking bunkers and buying up land in resource rich regions are white male tech CEOs and shareholders.
Solana Rice:
So there are lots of people doing lots of things. The Federal Trade Commission, for example, just filed suits against Amazon and Google. Of course, we probably both think that those suits could have a more specific racial equity lens in them. The FTC is mostly concerned with market capture. But I’m curious, when you were writing this report, who did you think should read it, and what should they do? Should we just be following the Federal Trade Commission and just hope and dream that it comes to something? What do you see as important for folks to be doing?
Julia Davis:
I think that there are a few answers to this question. One is I think we need to hold the important work of harm reduction and mitigation, à la the FTC’s intervention and other lawsuits that are coming down the pike, and address the harms in the here and now. And at the same time, I think we also need to be cultivating visions for how we want technology to show up in our lives, and to start asserting that this is what we do want and not just this is what we don’t want. Because I think part of the way the dominant system continues to keep us wrapped up in it is we don’t spend enough time focused on the futures that we’re longing for. And so I would just say, at a meta level, we must push back in the existing system and we need to be dreaming about what technology can do for us and how we want it to serve us. So that’s a big picture response.
In the category of harm reduction, I’ve been really heartened by the Biden administration’s recent executive order on AI, which at least addresses how AI systems are implemented in government. Some of the concerns that I raised in the earlier part of this conversation, and that are raised in the report, hopefully can be addressed by the guidance between the executive order and the Office of Management and Budget that basically dictates the ways in which people’s rights need to be protected through the implementation of AI systems in government agencies. So that’s a promising light, and it’s not enough. We really do need laws that get at the heart of these issues.
And I will say that I was really inspired by the work that Dr. Alondra Nelson led at the Office of Science and Technology Policy at the White House last year that resulted in the AI Bill of Rights. And I often say the means by which we produce something are the outcomes that we produce in the world. And what Alondra and her team did is that they had a very robust public participation process in devising this blueprint. And I really think it’s important for everyday people, for frontline communities, for advocates to have not only a seat at the table, but actually be centered in how legislation and policy is developed. And so I think there’s actually a really beautiful roadmap. It’s literally in the appendix of the blueprint for an AI Bill of Rights, how they did it. And so my hope is that legislators and policymakers really make use of that and replicate that in efforts moving forward.
In terms of who should read the report and what folks should be doing, I’m really hoping this report reaches at least two audiences. One is people working in tech policy, who often really don’t engage with issues of racial equity or with communities and organizations on the front lines of the most urgent racial justice issues, environmental justice, democracy and voting, housing, et cetera. I think that it is on tech policy folks to do the work of understanding where racial equity needs to be centered more clearly in what they’re working towards. And similarly … Or rather, one way to do that is for tech policy folks to build bridges with frontline communities and advocates.
And then I really do hope that this report reaches folks who are working on those frontline fights. Who are fighting for water access and fighting for environmental justice in their communities and fighting for equitable access to the ballot box and so forth, which is to say, I hope this is a resource for you all to understand how AI intersects and tech policy intersects with the things that you care most about and that you’re working on most readily, and that this can be a resource to y’all to see where we can build coalitions towards the futures that we are hoping for.
Solana Rice:
Well, Julia Rhodes Davis, thank you for joining the Racism Is Profitable pod. Thank you for your offering and this vision of what is possible. You can find Julia at jrdavis.me. Jrdavis.me. And you can find the report there and in our show notes, along with some other great resources. Thank you, Julia.
Julia Davis:
Thank you so much. It was really wonderful to be here today.
Solana Rice:
Thanks for listening. For more information, check out our list of episode resources and visit us at liberationinagenerationaction.org. Shout out to our producer and audio editor, Nino Fernandez, the design team at Trimtab, and the Lib Gen Action communications team. Like what you heard? Help us make some noise by telling two friends about the Racism Is Profitable podcast. Until next time, y’all, peace.