ALISON BEARD: Welcome to the HBR IdeaCast from Harvard Business Review. I’m Alison Beard.
In the past decade, the big five tech companies (Facebook, Amazon, Apple, Microsoft, and Google/Alphabet) have extended their reach and revenues in amazing ways. They’ve brought us lots of useful products and services, and they dominate various segments of the industry. They’re also great businesses. In 2020, they collectively earned income of nearly $200 billion.
But these tech giants and their leaders are also facing a lot of criticism for the negative impact they have on society, for the misinformation and vitriol spread online, for invading our privacy, for quashing competition, and for avoiding taxes in a way that allows them to pile up cash while a lot of the people whose personal data they profit from are struggling.
Is it possible to keep the good that big tech and all the smaller companies in the industry have created while getting rid of the bad?
Our guest today has some ideas. Mehran Sahami is a professor at Stanford and a former Google employee. Along with Stanford colleagues, Rob Reich and Jeremy Weinstein, he’s the author of System Error: Where Big Tech Went Wrong and How We Can Reboot. Mehran, thanks so much for speaking with me.
MEHRAN SAHAMI: Thanks for having me. It’s a pleasure to be here.
ALISON BEARD: So the first question is pretty obvious, where exactly did big tech go wrong? Facebook used to be a connector of people, now it’s a killer of democracy. Google was a search engine, now it’s a privacy invader. Amazon, a shopping platform that’s busting unions and small businesses. So how did we get here?
MEHRAN SAHAMI: That’s a great question, and part of the way we have to think about how we got here is the mindset of the people who’ve created these products. Now, if you think of the technology mindset, it’s oftentimes around quantitative metrics and around optimization. We like to refer to it as the optimization mindset. The idea is setting particular metrics that you want in your business that you want to try to optimize at scale. And so if you think about something like Facebook, what they want to do is they want to create connection, but how do they actually measure that? What they have is a proxy for connection, something like how often people engage on the site, how much time they spend there, how many pieces of content they click on. But clicking on something isn’t really connection. It’s a proxy for it. And if you take that at scale and you try to optimize it, what happens is you actually get externalities that are not what you wanted.
So you promote pieces of content that people are more likely to click on. Those might be pieces of content that are actually more titillating or more click-baity than, say, truthful content. And so, as a result, there might be a greater amplification of misinformation than truthful information because what it’s doing is maximizing that metric. And you can see this across a number of sites. So, for example, for YouTube, they might want to maximize the amount of time we spend watching videos because they equate the fact that we’re spending our time watching those videos with the fact that we’re happy. But, in fact, you can see the flaw there. Just because we’re watching videos doesn’t mean we’re happy and it ignores other values we might care about. And when we maximize one value like screen time because we’re equating it for happiness, we’re giving up other values that we might actually care about and that are important to society.
ALISON BEARD: And so the issue is that these companies are made up of engineers who are taught to optimize and be efficient. I would argue that people in the financial industry are taught the same thing. They then become the executives leading these companies, the VCs funding these companies, and so there’s no one waving the flag for other kinds of values?
MEHRAN SAHAMI: Well, they are well-meaning people by and large. I don’t think they have negative intent, at least the vast majority of them. But the problem is that most things in life involve a value trade-off, and when you’re optimizing and you’re picking one of those values, the other ones are getting left behind. And that’s part of the issue: how do you actually take a broader look at some of these criteria that they’re optimizing, but also recognize that the criteria by themselves are just a poor proxy for what we actually care about?
ALISON BEARD: I would argue that lots of industries and companies have this problem. They’re making value trade-offs. They’re doing both good and bad things for society. So why is big tech different? Why are we so focused on big tech?
MEHRAN SAHAMI: Because at this point in time we’re seeing the externalities from big tech on display in full force. So we’re seeing the notion of connection turning into rampant misinformation online. We’re also seeing the platforms take the market power that they have and turn it into political power so that they can continue to maintain the same free regulatory structure that they’ve been under for the past 30 years. And so what we’ve lost in that process are guardrails that bring back the values we might care about as a society as opposed to the values that might be important to the company.
ALISON BEARD: Yeah. So Azeem Azhar, who hosts an HBR podcast and also has a new book out on some of these issues, he argues that our institutions just haven’t been able to keep up with the exponential growth of these companies. So in a way, governments and we the consumers have let all of it happen. Is that fair?
MEHRAN SAHAMI: I think it’s fair from the standpoint of first thinking that the government has basically given big tech a pass. The regulatory structure in the 1990s, through things like the Communications Decency Act, was set up to give companies pretty broad rein in terms of the way they did business in the United States. You can see that pretty clearly if you contrast, for example, the United States and the European Union on the kinds of protections they have around data. Right now one of the choices that we give to people in the United States is in the free market we say, “If you don’t like these applications, well, you can just disengage.” There’s the Delete Facebook movement. You don’t have to use these apps. So what’s wrong with that? You just have this choice. And the analogy I liken that to is driving on the road.
The CDC estimates that about one million people are killed annually on roads worldwide. So should our choice just be whether you drive or you don’t? If that’s the only choice we have, then we’ve lost a lot of value, because there’s real value in being able to drive despite the fact that it’s dangerous. In the same sense, there’s real value in using these tech platforms despite the fact that they may take our data and try to get us to engage more. So what did we do in the case of roads? We didn’t just tell people, “Drive carefully. Good luck.” We created a whole set of safety regulations. There are stoplights, there are lanes on roads, there are speed limits, and so there’s a whole system that makes driving safer while at the same time we still count on individuals to drive safely.
That’s the kind of regulation that we would call for, for big tech: certain guardrails that prevent certain kinds of practices, like having free rein over someone’s data. If we get the values we care about, we can have a safer information superhighway while at the same time promoting innovation.
ALISON BEARD: Yeah. But you also advocate in the book for self-reform. What are some of the ways that the industry can course-correct?
MEHRAN SAHAMI: So we look at four main areas. One of them is algorithmic decision-making where we think about the fact that more and more significant decisions in our life, things like whether or not we get credit, whether or not we get a mortgage, who we date, those kinds of things are now determined more by algorithms. And so, one of the things we talk about is the fact that these algorithms, one, may be biased because they are just a reflection of the data that gets put into them and the data that gets put into them is a result generally of previous historical human decision-making that oftentimes reflects bias. But we also talk about, what are the processes by which this can be improved? So you could audit algorithms to see what kinds of bias there might be in outcomes. You could create algorithms that provide for an explanation for why they came up with the results or decisions that they did.
You can also look at things like the distributional effects more broadly to see, is there disparate impact from the decision-making in these algorithms? And you can examine the data that goes into machine learning to think about things like, what is the historical context, or what embedded social factors actually influence this data? Some of the other areas we dive into, for example, are data privacy, and we talk about, who has ownership of data? Where’s the transparency so that someone can understand what data is being collected about them? How can data be made more portable across sites? We also talk about artificial intelligence, what that will mean in the longer term in terms of potential job displacement, and what that means for economic policies to replace, for example, the support for the social safety net that we get through employment taxes right now.
Well, if you don’t have employment taxes because someone’s job is replaced by a robot, how do you make up for that shortfall? How do you think about redistributing educational opportunities so that the workforce that is displaced, and those numbers look pretty significant, actually is able to re-engage in a meaningful way in the labor force so you don’t get large-scale unemployment? And, finally, we talk about the power of platforms in terms of what that means at a larger level, the barriers to competitors, thinking about merger and acquisition activity, and the fact that there’s likely going to be greater scrutiny of that as we see now new regulatory structures coming into place as the government takes a more critical look at the platform power and monopolistic power of these large tech players.
ALISON BEARD: And in the book, you also talk bigger picture about getting these engineers to be more in touch with human and societal values. There’s a line where you say you’d like to see a shift toward asking which problems are worth solving and whether some important ones can’t be reduced to a computational solution. So tell me a little bit more about that idea of values-led technology.
MEHRAN SAHAMI: Right. So one of the ways you can think about it is, at a very basic level, what are the metrics that someone is trying to optimize in their business? And you can see simple examples of this in everyday life where the choices that actually get made with respect to a particular technology and the result you get from that has a pretty big impact.
ALISON BEARD: Right. Do we need more food delivery apps versus climate change solutions?
MEHRAN SAHAMI: Exactly. And since you mentioned climate change, a simple example of that is when you book a flight somewhere, sometimes you get these very strange routings: you’re flying from San Diego to Seattle and it routes you through Chicago. Why is it doing that? It’s doing that because the metric it’s probably trying to optimize is the price that you’re paying. It’s not trying to optimize for climate impact, because measuring that is actually hard, whereas measuring price is easy. And so you can see these cases where choosing what’s easy over what’s meaningful pushes us in directions that may get us further away from the values we actually care about.
ALISON BEARD: In that case though, you’re pushing for what the companies believe consumers care about and have proven to care about.
MEHRAN SAHAMI: That’s true, but at the same time, there are certain things that consumers care about where they don’t get a choice now. And that’s part of what we’re pushing for: thinking about the ways in which consumers don’t have choices. So, for example, take something like privacy or data portability. If I want to move from one platform to another because I like the other platform’s policies better, I don’t have that choice right now. And that’s one of the things we think regulation can get us to.
ALISON BEARD: You are an educator and you’re working with the tech leaders of the future. When you’re in the classroom with students at Stanford, what do you see that worries you and what do you see that gives you hope for the future of the industry?
MEHRAN SAHAMI: Some of the things that worry me are sometimes the myopic view of what is success. And that’s a broader question, but sometimes success is just equated with things like making a lot of money. And there isn’t a lot of thought given to what are the externalities that are generated by a business? What does it mean for distributional impact among different people so that when you solve a problem for a particular sliver of the population, there’s another part of the population that’s getting ignored? And if you continue to put your emphasis on, say, the affluent portion of the population, because they’re going to pay for particular services or products that you deliver, it means that the non-affluent portion of the population gets ignored by the march of technology. And so that just further exacerbates inequality. But the place I really have hope is I think that students are paying more attention to these issues. They’re more aware of distributional impacts in society, they’re more aware of the inequality that exists, and they have more of an inclination to want to do something about it.
ALISON BEARD: Do you feel that the smaller, newer startups led by some of these younger tech executives are doing a better job than their predecessors?
MEHRAN SAHAMI: I think there’s a mixture. There are definitely some companies that are taking a more socially conscious approach toward the problems they’re solving and the kinds of solutions they’re trying to reach. One of the issues that comes up is, when the marketplace becomes very competitive, how do people re-examine their values? Do they stick to the values that they want to have, or do they end up making compromises along the way that push them toward a different set of values because of the competitive landscape? I was at a dinner the other night with a venture capitalist and some of their portfolio companies, and one of the big questions that came up is, when do you decide whether or not to take on a particular customer because you may not think that customer’s practices are particularly savory? The competitive landscape might push you toward taking all customers, but that also means that, for example, you could be taking on white supremacist groups. Is that really the kind of group that you want to be supporting on your platform?
ALISON BEARD: Okay. So if I’m a leader of a tech company and I agree that the industry and my company needs to change, at least to get ahead of regulation, what should I do to fix the way my organization works? Bezos or Zuckerberg might not be listening, but you never know, what advice do you give them?
MEHRAN SAHAMI: Right. Step number one, as an educator I have to say this, is to educate yourself: find out what the actual issues in that industry are. That includes things like understanding where regulation may be coming, but at a more fundamental level, understanding the dynamics. What are the things that customers really value? And also, what are the values that you believe are really important to promote as a business? Get clarity on that. Once you have that clarity, take those values and identify how they turn into the metrics that you actually want to measure in your business. They may not be easy metrics to measure, but that makes them more meaningful if they’re the things you actually care about. And then, how do you put the right incentives into place that get your employees focused on the bigger picture and the larger impacts that you’re having, rather than just trying to optimize for one particular metric because their compensation’s entirely tied to it?
ALISON BEARD: What about a lower level manager or even an individual coder employee? Is there anything you would advise them to do from the ground up?
MEHRAN SAHAMI: For the level of the engineer or the coder, it’s an understanding that the choices that they make when writing software actually are value laden. And that’s something that doesn’t get appreciated as fully as it should, that someone is just writing some code to, say, optimize the accuracy of a machine learning algorithm. That’s a pretty common thing to do, it’s something that they’re taught in school, but understanding that when you’re optimizing accuracy, what does that mean? It means if you have a small minority in your population, you can get your data or your predictions wrong on that group and you still do pretty well overall in accuracy. So if that’s the only thing you’re measuring, it’s easy for that group to get a disproportionate negative impact. So as an engineer, in terms of the tools that you have, you need to think about the code that you’re writing, what you’re optimizing for at a granular level, and then, how can you measure what you’re doing to make sure that you’re not actually getting these deleterious effects?
ALISON BEARD: And if I’m outside the industry, this obviously matters to me as a consumer, but is there anything else I can do to promote better values in big tech?
MEHRAN SAHAMI: There are. There are personal choices that we can make, and some of those might involve things like setting the privacy settings in the applications we use; using incognito mode for searches, because we don’t want them to affect our search history or the information we might get in the future; choosing which apps or platforms we engage with and which ones we avoid; and setting the cookie preferences that you see everywhere now because of GDPR regulations. But really, at the end of the day, as with the driving-on-the-road analogy, we need a combination of both personal choices and regulation to really get the overall picture right.
ALISON BEARD: Right. But, certainly, businesses outside the tech industry, everyone has to use tech now, so if you’re contracting with any of these providers, you can also apply pressure. That might be more meaningful than pressure that comes from an individual Amazon shopper or Google search user.
MEHRAN SAHAMI: Exactly. So from the standpoint of having more leverage or more scale, if you’re an enterprise that does business with these companies, you can definitely engage with them on what are the values that are important to you because those might also be values that are important to your customers.
ALISON BEARD: Going back to regulation, there are obviously moves toward reining in big tech. Do you think legislators are doing a good job? And do you think there needs to be more global coordination on all of this?
MEHRAN SAHAMI: Well, I think what we’re seeing now is a policy window open up, so there is more activity around what might happen. What remains to be seen is where we’re actually going to get to in terms of policies. But we’ve actually seen steps taken toward things like more antitrust activity. And part of the difficulty there is the classical framework for antitrust, which is around consumer pricing, doesn’t apply so directly in big tech when a lot of the products are free.
But I think that’s what’s being grappled with right now: how can you still think about the monopolistic power of some of these big tech platforms even if it’s not a pricing issue? We can also think about whether or not the platforms use their power to promote their own products. So, for example, does Google promote its own phone through its platform? Does Amazon compete against other sellers on its platform because it can see the data stream of what consumers want and create its own versions of the popular products it sees? Those are places where we might want to draw the line to have more of a competitive landscape. And there is legislation in the works around all of those principles; the real question is, do we actually get there? And that’s what we need to demand from our lawmakers and our regulators.
ALISON BEARD: I’ve also heard the argument that all of this extra regulation actually benefits the big guys because they have the money to jump through the regulatory hoops whereas the smaller companies that might end up competing with them can’t.
MEHRAN SAHAMI: It’s a good point. There’s this notion of a regulatory moat: can someone, for example, only achieve some of the desired outcomes of regulation when they reach a particular scale? And so you’re favoring the large tech platforms through this regulation. But I think there are ways to carve it out and do it smartly. If you look at the California Consumer Privacy Act, it takes this tack. It says, “These regulations apply to companies of a certain size, as defined by things like the number of customers or the annual revenue.” And so what it lays out is a framework that says, “Look, when you get big enough, these regulations are going to apply to you, so you need to be aware of them from the outset when you’re designing your products and building the infrastructure for things like your data collection.”
But early on, we understand that the competitive landscape is different, so the regulations don’t apply to you until you achieve scale. Because if you never achieve scale, they’re probably not going to have that big an impact on you, but you still need to be prepared for them and understand what the regulations are, even from the outset.
ALISON BEARD: How does this play out in other parts of the world? One country in which these companies are not dominant is China because they have their own mega technology companies. How is the Chinese market developing?
MEHRAN SAHAMI: There, some of the notions of regulation are just very different, because it’s not a democratic state in the way that we would think about it, with individuals voting for representatives who then make policies. So there, what we hope to see is that there’s still regulation that provides for individual consumer protection and forward-looking policies that do things like provide educational opportunities for re-skilling when there’s labor displacement as a result of AI. But there’s less of a direct impact we can have on those policies, and so one of the arguments we make is, if we live in a democracy and we have the luxury of being able to push for the policies that we’d like to have, that’s what we should be doing.
ALISON BEARD: But aren’t, at the same time, all of these technology companies spending lots of money on political donations and lobbying efforts to make sure that they remain as free as they can?
MEHRAN SAHAMI: Absolutely. One of the things that companies have done, and that we’ve seen on display in full force the past few years, is take market power and turn it into political power through lobbying. A great example of that is Proposition 22 in California. What Proposition 22 did was basically create a carve-out so that, say, the drivers for Lyft and Uber and the delivery workers for DoorDash would not be classified as employees under California law. They would be considered contractors, and as a result, they would not be eligible for a number of employment benefits. And so the companies, DoorDash, Uber, and Lyft among others, mounted a significant lobbying effort, to the tune of over $200 million, to get Proposition 22 to pass in California and get this carve-out for them. And that sends a pretty clear message, not only that they had the political power to do this in California, but that they could mount this kind of challenge in any state that wanted to put in a regulation to classify their workers as employees.
Well, it turns out that just a week ago Proposition 22 was struck down as unconstitutional, which shows that even in the wake of hundreds of millions of dollars of lobbying pressure to get regulations passed the way they want, we still have a legal structure that can overcome that and say, “Actually, there are other values we care about.” So I’m hopeful, with that as an example, that we can actually get the kinds of outcomes we want despite significant lobbying efforts by big tech.
ALISON BEARD: What do you see as the ideal balance between tech fueled economic growth and consumer and societal welfare going forward? Can we really achieve a sweet spot?
MEHRAN SAHAMI: I think we can, and one of the things that we’ve seen in the last few years is just the extreme wealth inequality that’s been generated through a number of different industries, big tech being one of them. So one of the questions we can ask is: back in the ’50s and ’60s, we had a different regulatory structure and a different tax structure, but we still got a lot of innovation, and at the same time we had a society that, in some ways at least, was more equal. How do we get back to a place where the gains made by big tech companies are more broadly distributed? And if you think about, say, the Ubers and Lyfts of the world, that’s a great example. Why shouldn’t the profits of those companies be distributed a little bit more equitably among their drivers and other employees?
But when we allow for a regulatory framework that says we can concentrate that wealth in the founders or employees of the company, while the workers who make up, say, the driver base are getting benefits eliminated and wages depressed, that’s where we see how technology is concentrating wealth. And it doesn’t have to be that way.
ALISON BEARD: Okay. Well, I hope that we can find the right balance, because I still want to shop on Amazon and use my iPhone, but I also want some of these big problems to be solved.
MEHRAN SAHAMI: And that’s the place I think we can get to with a little bit of regulation and thinking more about the values we care about instilling in the companies and in society.
ALISON BEARD: Terrific. Mehran, thanks so much for talking with me today.
MEHRAN SAHAMI: Thank you for taking the time. Really appreciate it.
ALISON BEARD: That’s Mehran Sahami, a professor at Stanford and coauthor of the book System Error: Where Big Tech Went Wrong and How We Can Reboot.
If you liked this episode and want to hear more, like my interview with Best Buy’s Hubert Joly on walking the talk of shareholder capitalism, please look for us on your favorite podcast app.
This episode was produced by Mary Dooe. We get technical help from Rob Eckhardt. Adam Buchholz is our audio product manager. Thanks for listening to the HBR IdeaCast. I’m Alison Beard.