Media

Frances Haugen takes on Facebook: the making of a modern US hero


The journey from disillusioned ex-employee to modern-day heroine took Frances Haugen less than five months. The 37-year-old logged out of Facebook’s company network for the last time in May and last week was being publicly lauded as a “21st-century American hero” on Washington’s Capitol Hill.

That journey was paved with the tens of thousands of internal documents Haugen took from Facebook’s systems, which formed the backbone of a series of damning revelations first published in the Wall Street Journal last month. They revealed that Facebook knew its products were damaging the mental health of teenage girls, that it resisted changes that would have made its main platform less divisive, and that it knew the platform was being used to incite ethnic violence in Ethiopia.

The ensuing public backlash tipped Facebook into its biggest crisis since the Cambridge Analytica scandal of 2018 and culminated in damning testimony by Haugen in front of US senators last Tuesday. Her opening words were delivered against an excruciating backdrop for Facebook: only hours earlier all its services – including its eponymous platform, the Instagram photo and video sharing app and the WhatsApp messaging service – had gone offline for six hours because of a maintenance error, an outage that affected the company’s 2.8 billion daily users. Facebook’s services then suffered more glitches on Friday.

Frances Haugen, former Facebook employee turned whistleblower, arrives to testify before a Senate subcommittee on Capitol Hill on 5 October 2021. Photograph: Kevin Dietsch/Getty Images

“I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Haugen told a Senate subcommittee. “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won’t solve this crisis without your help.” In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

In about four hours of testimony, Haugen gave a detailed account of her near-two-year stint at Facebook as part of a team looking at preventing election interference on its platforms. She repeatedly referred to the company choosing growth and profit over safety and warned that Facebook and Instagram’s algorithms – which tailor the content that a user sees – were causing harm. In one exchange, she told senators that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

Haugen was lauded by her interlocutors, with the Democratic senator Ed Markey thanking her for becoming a “21st-century American hero”.

Haugen is not the first whistleblower to raise concerns about the tech giant. In 2018, Christopher Wylie, a Canadian data analyst, revealed to the Observer that his former employer, Cambridge Analytica, had harvested millions of Facebook profiles of US voters. One year later, Facebook was fined $5bn by the US Federal Trade Commission for “deceiving” users about its ability to keep personal information private. At the time of Wylie’s revelations, Facebook was contrite, taking out full-page adverts in British and US newspapers to apologise and state that it was limiting the data its apps get.

Speaking to the Observer last week, Wylie said he had relived his own experience as a whistleblower by watching Haugen. But he also found the flashbacks frustrating – because nothing has changed.

“I do think we are back in 2018, talking about all of this being new. But it’s not,” says Wylie, adding, in a tribute to Haugen: “It takes a lot to confront Facebook.”

Wylie, who is now global head of insight and emerging technologies at fashion retailer H&M, says he blew the whistle to warn authorities about the iniquities of social media. “The reason I did this was to inform regulators and legislators about what was going on … that it has to be taken seriously and that there are safety issues with these platforms.”

So for Wylie, seeing Haugen warn that Facebook’s algorithms are a danger to the public good made him feel that an opportunity had already been missed.

Facebook CEO Mark Zuckerberg prepares to face a joint US Senate committee in April 2018. Insiders say Facebook has never been popular in Washington. Photograph: Jim Watson/AFP/Getty Images

“The fact that we are still having a conversation about what is happening, not what are we going to do about it, I find slightly exasperating,” he says.

However, momentum is building among politicians on both sides of the Atlantic to do something. “Facebook is like big tobacco, enticing young kids with that first cigarette,” said Senator Markey at the hearing. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer.”

There was certainly little warmth shown towards Mark Zuckerberg, Facebook’s founder and chief executive, amid references by senators to an Instagram post showing him sailing with his wife, Priscilla, in the weekend running up to the hearing. In one exchange, Haugen said: “The buck stops with him.”

One Silicon Valley executive told the Observer that this lack of regard extended beyond the subcommittee on consumer protection. Facebook, and Mark Zuckerberg, have never been that popular in Washington and are even less so now. “Facebook never had the kinds of friends that the likes of Google did. Now, after this, they will find themselves even more alone.”

In the US alone there are numerous political, legal and regulatory moves afoot. Senators are pushing reforms to section 230 of the Communications Decency Act, which exempts social media companies from liability for what is posted on their networks; the Federal Trade Commission is suing to break up Facebook; and Haugen’s lawyers have filed at least eight complaints, based on her leaked documents, with the US financial watchdog.

Charles Arthur, author of a book about the dangers of social media, Social Warming, has advocated breaking all social networks into discrete geographical entities. If that were to happen, he argues, Facebook, Twitter, etc, would be better able to moderate their platforms.

“The problems increase geometrically if you increase the size of the network arithmetically,” says Arthur, a former Guardian journalist.

“If you have a network of 100 people you have a certain number of interactions that are possible. But if you have 200 people you have four times as many interactions that could be problematic. Then with 400 it’s 16 times as many. The problems scale up faster than the network, but the companies are not increasing the amount of moderation in the same way.”
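Arthur’s figures follow from simple combinatorics: a network of n people allows n(n-1)/2 possible pairwise interactions, so the count grows roughly with the square of the network’s size. A quick check of the numbers he cites:

\[
\binom{100}{2} = 4{,}950, \qquad \binom{200}{2} = 19{,}900 \approx 4 \times 4{,}950, \qquad \binom{400}{2} = 79{,}800 \approx 16 \times 4{,}950
\]

Double the users and the possible interactions – and with them the potential flashpoints – roughly quadruple, while, on Arthur’s account, moderation does not keep pace.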

The answer, says Arthur, is multiple breakups. “So the solution is that we limit the size of the networks. If you say that media owners like the Murdochs cannot own a certain amount of newspapers, then you say to social media networks you cannot be bigger than a certain size,” he says, adding that limiting the size geographically, by country, is the easiest solution to implement and legislate. Facebook argues that only companies with considerable resources, like its own, can cope with the task of moderating vast amounts of content.

Sheryl Sandberg, Facebook’s chief operating officer, with Mark Zuckerberg at a conference in July. Photograph: Andrew H Walker/REX/Shutterstock

In the UK, the online safety bill will take on the work of regulating Facebook. Currently in draft form, the bill imposes a duty of care on social media companies to protect users from harmful content, under the threat of fines from the communications regulator, Ofcom, if they fail in that duty. Damian Collins, the Conservative MP chairing the joint committee scrutinising the bill, says Haugen’s testimony and document leaks “lead you to question Facebook’s moral compass as a company”. He told the Observer that they also demonstrated why the online safety bill is needed: decisions such as how to treat harmful content cannot be left to Facebook. “It shows you these decisions cannot be left to them. The call for regulation will be even greater now.”

Collins adds that the sort of documents and internal research released by Haugen must be made available to an independent regulator under the new regime. “How can the regulator be effective if key information about the platform and how it is being run is being kept from them?”

The education secretary, Nadhim Zahawi, added to the UK political pressure on Facebook on Saturday when he told the Times that Instagram was not fulfilling its duty of care to young people because it is “pumping out” harmful material to them.

For Wylie, one answer is to regulate social media companies, and their algorithms, like the pharmaceutical, aerospace and automotive industries.

“One of the failures of public discourse around all of the problems with big tech and algorithms is that we fail to understand that these are products of engineering. We do not apply to tech companies the existing regulatory principles that apply in other fields of engineering; we do not require safety testing of algorithms,” he says, warning that social media products and algorithm changes are released to the public unchecked by regulators.

Facebook was contrite after Cambridge Analytica, but it has been defiant in the face of Haugen’s revelations. In a blogpost written after Haugen’s testimony, Zuckerberg said her claims that the company puts profit over people’s safety are “just not true”.

“At the heart of these accusations is this idea that we prioritise profit over safety and wellbeing. That’s just not true,” he said. He added: “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.”

Referring to one of the most frequently cited Haugen revelations, that Facebook failed to act on internal research showing that Instagram had a negative impact on the mental health of teenage girls, he said the claims “don’t make any sense”. One particularly damaging document showed that for teenage girls already having “hard moments”, one in three found Instagram made body issues worse. “If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” he wrote.

Professor Shoshana Zuboff is a longstanding critic of the big tech companies. Photograph: David Levene/The Guardian

Shoshana Zuboff, a Harvard professor and the author of The Age of Surveillance Capitalism, said that Haugen had given the world a unique insight. “Frances Haugen has given us an X-ray view into the machine operations of surveillance capitalism in one company. And it is a very consequential company.”

Zuboff says Facebook, along with Google and many others, is a pure distillation of her thesis: that big tech companies secretly mine personal experience, turn it into data, and generate behavioural predictions that they sell to business customers. Facebook makes $84bn of its $86bn in annual revenues from advertising. “These systems rely on surveillance to invade our once ‘private’ experience with operations designed to bypass individual awareness. In other words, human experience is redefined as free raw material for the massive-scale extraction of behavioural data. The most intimate data is prized for its predictive value: what individuals or types of people are most likely to click on an ad and buy its products, who will pay their bills, get sick or drive safely. They sell human futures – predictions of what we will do next and later.”

Zuboff warns that it would be a mistake for Haugen’s testimony to be interpreted as a problem attached to a single company or leader. For Zuboff, the economic playbook has now been picked up by multiple sectors, from education and health to cars and agriculture. Every sector is looking for ways to generate profit from people’s data, she says.

“Even today, the public and lawmakers have not grasped these facts as deeply as they must,” she says. Echoing Wylie, Zuboff adds that the Cambridge Analytica revelations failed to produce meaningful change. Haugen’s documentation and insights are a fresh opportunity for the public and politicians to get this right, she says: “We have to understand surveillance capitalism’s evolving aims and social harms, otherwise we will have lost another crucial opportunity to enact the new charters of rights and laws that can finally set us on course toward a digital and democratic century.”

For Zuboff, Wylie, politicians and regulators, Haugen is a hero – and a second chance.

The Facebook years

2004: Facebook is founded by then-Harvard student Mark Zuckerberg, its name derived from the face-book directories given to American university students.

2006: After extending access the previous year to high-school students and students outside North American universities, Facebook is opened to members of the public above the age of 13.

2008: Zuckerberg, now Facebook chief executive, hires Sheryl Sandberg from Google as the company’s chief operating officer, a pivotal moment in Facebook’s development as a commercially driven company.

2012: Facebook floats on the Nasdaq stock exchange with a valuation of more than $100bn. It is now worth more than $900bn. It buys Instagram for $1bn – one of the great corporate bargains.

2018: The Observer reveals that Cambridge Analytica, a UK data firm, harvested millions of Facebook profiles of US voters. Facebook is subsequently fined $5bn by the US Federal Trade Commission for “deceiving” users about its ability to keep personal information private.

2021: Frances Haugen, a Facebook product manager, leaves the company in May and takes with her tens of thousands of pages of internal documents that contain a litany of revelations, including: that Instagram knew it was harming the mental health of some teenage girls; that a Facebook algorithm change in 2018 increased divisiveness on the platform; and that nearly 90% of Facebook’s moderation efforts are focused on English content, despite the majority of users being non-English speakers.




