The congressional testimony of Frances Haugen is being described as a potential watershed moment after the former Facebook employee turned whistleblower warned that lawmakers must “act now” to rein in the social media company.
But the impact of the hearing – in which Haugen used her time at Facebook and leaked internal research to build a case that it is harming children, destabilizing democracies, and putting profits over safety – is uncertain, as lawmakers, experts and regulators remain split over the path forward.
The Guardian spoke to several experts across the tech industry about what could and should lie ahead for Facebook. The interviews have been edited and condensed for clarity.
‘Surveillance capitalism is as immoral as child labor’
Roger McNamee, early Facebook investor turned critic
Frances Haugen’s revelations and testimony before Congress are devastating to Facebook. She is courageous, authoritative, and utterly convincing. We knew about the issues before, but she changed the game by providing internal documents that prove Facebook’s management had early warning of many horrible problems and chose not to take appropriate steps. In her testimony, she confirmed that the incentives of Facebook’s business model lead to the amplification of fear and outrage to the detriment of public health and democracy.
When Haugen notes the moral failing of Mark Zuckerberg prioritizing profits over public safety, we need to recognize that this problem is far bigger than Facebook. All CEOs are told to focus on maximizing shareholder value at all costs. Facebook’s business model – which the Harvard professor Shoshana Zuboff calls surveillance capitalism – employs surveillance to track us and data to manipulate our choices and behavior. It was invented by Google and has since been adopted by Amazon, Microsoft, and companies in every sector of the economy. Regulations must anticipate the harms to come from new use cases.
Haugen has removed the last excuse Congress had for inaction. They now need to legislate in three areas: privacy, safety, and competition. With respect to privacy, people have a right to make their own choices without interference. Surveillance capitalism is as immoral as child labor and should be banned. We also need something like an FDA for tech to ensure that products are safe and new antitrust laws to reduce the harm from monopolies.
‘We need Facebook to die’
Evan Greer, director of Fight for the Future, a digital rights organization
This should be a watershed moment that finally gets lawmakers in DC to get off their butts and pass a real data privacy law. That’s the single most important thing elected officials could do right now to reduce Facebook’s harm. It’s really hard to directly regulate the algorithms Facebook uses, but you can make it illegal for them to harvest all the data they use to power those algorithms.
Facebook’s surveillance capitalist business model is fundamentally incompatible with basic human rights and democracy. That’s why we should push for harm reduction policies such as privacy legislation and antitrust enforcement that address the most immediate and urgent harms of big tech’s monopoly power.
But in the end, we need Facebook to die. We need to make it obsolete by building decentralized, community-driven alternatives, and we need to ensure that those alternatives have a chance to compete with and eventually replace Silicon Valley incumbents.
‘Historic opportunities for regulation’
Fadi Quran, campaign director at Avaaz, a global non-profit activist group
Whether it was Cambridge Analytica, Russian interference in the 2016 elections, or the Rohingya genocide, Facebook has managed again and again to withstand the waves of scrutiny by making some changes to its platform to appease decision-makers. What’s different now is that there is a wave of regulatory momentum in the EU and, to a lesser extent, in the US that can end this vicious cycle.
The ball is in legislators’ court. Facebook has a powerful lobbying effort designed to influence regulations to fit its interests. It is on Joe Biden, Nancy Pelosi, and key members of Congress to ensure that algorithmic transparency and accountability regulation becomes an urgent priority, and on citizens and civil society around the world to mobilize to ensure that big tech lobbyists don’t define the legislative agenda. People’s lives, the psychological health of our children, and the future of our democracies are at stake.
Public trust in Facebook is plummeting, but unfortunately, the company has close to a monopoly with its control over Instagram, WhatsApp, and Facebook – platforms that have become synonymous with the internet in many parts of the world. It is unlikely that the hundreds of millions of users relying on these platforms will step away from them without a massive disruption in the social media and messaging space. However, there is promising legislation being developed in the EU, such as the Digital Services Act, and a number of bills proposed by Congress that have a chance of creating serious protections from the harms of big tech. Our conversations with key leaders in both the US and EU indicate that there will be very serious efforts to regulate the platforms, but Facebook and other tech lobbyists have significant leverage and will do all they can to water these proposals down.
In short, the upcoming year will offer historic opportunities for regulation, and Haugen’s brave revelations have added much-needed urgency, but it will take serious organizing to ensure politicians act effectively.
‘I don’t think it’s going to change perceptions too much’
Daniel Castro, vice president of the non-profit thinktank the Information Technology and Innovation Foundation*
I don’t think it’s going to change perceptions too much. Those who thought Facebook was not doing enough to protect democracy, stop the spread of misinformation or stop bullying will continue to think so and will have more ammunition to make that argument.
We know that bad things are happening online. The question is, well, what do you do about it? And that’s where I think the debate is going to move. Some of that will be calls for more regulation and oversight. Some of it might be going after the company through antitrust laws, data protection laws or proposals around child safety. We may also see more research on how to actually stop the spread of such information online. These are really hard questions that I don’t think one company alone is going to be able to answer.
The question that’s always on the table is, “has a company been truthful?” The immediate takeaway for not just Facebook, but any company operating in this space, is going to be to look very closely at what they’re doing and what they’re saying and make sure those two things align. The internal research Haugen brought to light shows that Facebook was paying attention to some of the societal issues that are being debated. There’s long been a critique of social media and tech in general that they’re ignoring these issues. If anything, this evidence shows that they did care and that they were looking closely at them. That leaves the question of whether they did enough. Clearly the whistleblower doesn’t think so. I think that’s still a question different people can reasonably disagree over. Those same surveys also showed that Facebook was having a positive impact on teenagers and youth.
‘Most people think there’s a problem but don’t agree on the solution’
Gautam Hans, associate clinical professor of law at Vanderbilt University
Facebook has been embroiled in problems since its inception almost 20 years ago. The company has managed to persist, even through many rounds of regulatory scrutiny, because most people think that there’s a problem but don’t agree on what the solution is. You have a balkanized opposition.
You’ve heard about all sorts of proposals, from passing privacy legislation to some sort of divestment or antitrust remedy to addressing section 230 and the ways in which Facebook has a lot of immunity from lawsuits. Some could work, some could not …
I think Facebook will survive. It’s too powerful and too robust. It’s hard to think of a world in which it doesn’t exist. But I can certainly see a change in its public perception or structure. But the most obvious tools [to regulate Facebook] have shortcomings.
I don’t want to say it’s hopeless … The corporate pressure of media campaigns, activism and unhappy employees is maybe more effective than any of those legal strategies that people have been putting forward. [Facebook executives] definitely don’t get it yet, but a company like this can’t keep putting its head in the sand.
* ITIF receives donations from some tech industry groups.