Meta designed platforms to get children addicted, court documents allege

Instagram and Facebook parent company Meta purposefully engineered its platforms to addict children and knowingly allowed underage users to hold accounts, according to a newly unsealed legal complaint.

The complaint is a key part of a lawsuit filed against Meta by the attorneys general of 33 states in late October and was originally redacted. It alleges the social media company knew – but never disclosed – it had received millions of complaints about underage users on Instagram but only disabled a fraction of those accounts. The large number of underage users was an “open secret” at the company, the suit alleges, citing internal company documents.

In one example, the lawsuit cites an internal email thread in which employees discuss why a 12-year-old girl’s four accounts were not deleted following complaints from the girl’s mother stating that her daughter was 12 years old and requesting that the accounts be taken down. The employees concluded that “the accounts were ignored” in part because representatives of Meta “couldn’t tell for sure the user was underage”.

The complaint said that in 2021, Meta received over 402,000 reports of under-13 users on Instagram but that 164,000 – far fewer than half of the reported accounts – were “disabled for potentially being under the age of 13” that year. The complaint noted that at times Meta has a backlog of up to 2.5m accounts of younger children awaiting action.

The complaint alleges this and other incidents violate the Children’s Online Privacy Protection Act, which requires that online services provide notice and obtain parental consent before collecting data from children.

The lawsuit also focuses on longstanding assertions that Meta knowingly created products that were addictive and harmful to children, brought into sharp focus by whistleblower Frances Haugen, who revealed that internal studies showed platforms like Instagram led children to anorexia-related content. Haugen also stated the company intentionally targets children under the age of 18.

Company documents cited in the complaint described several Meta officials acknowledging the company designed its products to exploit shortcomings in youthful psychology, including a May 2020 internal presentation called “teen fundamentals” which highlighted certain vulnerabilities of the young brain that could be exploited by product development.

The presentation discussed teen brains’ relative immaturity, and teenagers’ tendency to be driven by “emotion, the intrigue of novelty and reward”, and asked how these characteristics could “manifest … in product usage”.

Meta said in a statement that the complaint misrepresents its work over the past decade to make the online experience safe for teens, noting it has “over 30 tools to support them and their parents”.

With respect to barring younger users from the service, Meta argued age verification is a “complex industry challenge”.

Instead, Meta said it favors shifting the burden of policing underage usage to parents and to app store operators like Google and Apple, specifically by supporting federal legislation that would require app stores to obtain parental approval whenever youths under 16 download apps.

One Facebook safety executive alluded to the possibility that cracking down on younger users might hurt the company’s business in a 2019 email.

But a year later, the same executive expressed frustration that while Facebook readily studied the usage of underage users for business reasons, it didn’t show the same enthusiasm for ways to identify younger kids and remove them from its platforms.
