
Meta admitted what about teens? Internal docs?

Mark, baby, it's perfectly OK to dish out an addictive product that pancakes a teen's psyche? Just don't put it in writing. I guess they didn't teach you that in school.



Meta Designed Products to Capitalize on Teen Vulnerabilities, States Allege

Newly unredacted documents in complaint by attorneys general show Meta conversations about age, product design and potential harms

By Jeff Horwitz, WSJ

Nov. 25, 2023




Meta Platforms sought to design its social-media products in ways that take advantage of known weaknesses of young users’ brains, according to newly unredacted legal filings citing internal company documents.


An internal 2020 Meta presentation shows that the company sought to engineer its products to capitalize on the parts of youth psychology that render teens “predisposed to impulse, peer pressure, and potentially harmful risky behavior,” the filings show.

References to the documents were initially redacted in the suit, which was filed in late October by a coalition of 41 states and the District of Columbia, alleging that Meta has intentionally built Facebook and Instagram with addictive features that harm young users. Meta approved the filing of an unredacted version on Wednesday.


“Teens are insatiable when it comes to ‘feel good’ dopamine effects,” the Meta presentation shows, according to the unredacted filing, describing the company’s existing product as already well-suited to providing the sort of stimuli that trigger the potent neurotransmitter. “And every time one of our teen users finds something unexpected their brains deliver them a dopamine hit.”


Some Meta executives involved with youth well-being issues internally acknowledged that such concerns were especially pronounced for younger teens.

“It’s not ‘regulators’ or ‘critics’ who think Instagram is unhealthy for young teens—it’s everyone from researchers and academic experts to parents,” Karina Newton, Instagram’s head of policy, wrote in a May 2021 email cited by the attorneys general. “The blueprint of the app is inherently not designed for an age group that don’t have the same cognitive and emotional skills that older teens do.”


Meta says it didn’t design its products to be addictive for teens.


“The complaint mischaracterizes our work using selective quotes and cherry-picked documents,” said Stephanie Otway, a spokeswoman for the company.



Meta also condoned usage of Facebook and Instagram by preteens, the unredacted court filings allege.


The states allege that Meta has long known that its platforms have weak protections against usage by children under the age of 13, who are generally barred by both Meta’s rules and federal law. Company algorithms estimated that Meta has as many as four million underage users in the U.S. Rather than seeking to crack down on underage usage, according to the complaint, Meta created charts “boasting Instagram’s penetration into 11- and 12-year-old demographic cohorts.”


“In December 2017, an Instagram employee indicated that Meta had a method to ascertain young users’ ages but advised that ‘you probably don’t want to open this pandora’s box’ regarding age verification improvements,” the states say in the suit.

Some senior executives raised the possibility that cracking down on underage usage could hurt Meta’s business. In a 2019 email, Meta’s head of global safety, Antigone Davis, asked Nick Clegg, the company’s president of global affairs, to clarify whether the goal for identifying users under the age of 13 was to remove them “or whether we are waiting to test growth impact before committing to anything.”


Davis later expressed frustration that the company appeared willing to study underage usage for business reasons but not in support of efforts to remove those users, according to a 2020 email cited in the complaint. The states say Meta made little progress on automated detection systems or on adequately staffing the team that reviewed user reports of underage activity.


“Meta at times has a backlog of 2-2.5 million under-13 accounts awaiting action,” according to the complaint.


Otway, the Meta spokeswoman, said Instagram works to remove underage users when it finds them. Because verifying the age of people online is a complex industry problem, she said, the company supports legislation that would allow parents to control what apps users under 16 can download.


The unredacted citations also demonstrate that the company’s communications staff has, at times, expressed qualms about the difficulty of arguing that Meta is a responsible steward of young users.


“Our own research confirmed what everyone has long suspected,” Otway wrote to Instagram head Adam Mosseri after The Wall Street Journal notified the company in August 2021 that it had obtained records in which Instagram’s well-being team concluded that the platform negatively affected the self-esteem of a significant portion of teen girls.


The unredacted material also includes allegations that Meta Chief Executive Mark Zuckerberg instructed his subordinates to give priority to boosting usage of its platforms over the well-being of users.


In one email thread stretching from late 2017 into early 2018, Zuckerberg’s top deputies—including Chief of Product Chris Cox and current Chief Marketing Officer Alex Schultz—backed a proposal to ease off the company’s heavy use of notifications, the push alerts meant to bring users onto Facebook and Instagram more regularly. Such notices were internally thought to aggravate what the company called “problematic use,” cases in which users reported that their inability to stop using social media was detrimental to their work, sleep or social life.


Problematic use was especially an issue for teens, according to internal studies previously reviewed by the Journal. But Meta relied on notifications to drive usage growth, especially among teens, who had a “higher tolerance” for being interrupted by them than adult users.


“Fundamentally I believe that we have abused the notifications channel as a company,” wrote Schultz in the unredacted email thread, concurring with Cox, who said the company shouldn’t back off doing what was “better for people” because usage metrics were down.


Zuckerberg overrode them, according to the unredacted portions of the complaint, with executive Naomi Gleit, now head of product at Meta, saying that daily usage “is a bigger concern for Mark right now than user experience.”


Zuckerberg also repeatedly dismissed warnings from senior company officials that its flagship social-media platforms were harming young users, according to unsealed allegations in a lawsuit filed by Massachusetts earlier this month.


Otway disputed the contention by the state attorneys general that the company gave priority to its own well-being over that of its users.




“This conversation—from over five years ago—has nothing to do with people’s well-being,” Otway said, calling user experience “a broad term.” Otway added that the company subsequently added optional features such as “quiet mode,” which encourages users to consider closing the app when scrolling late at night.


The complaint cites numerous other executives making public claims that were allegedly contradicted by internal documents. While Davis told Congress that the company didn’t consider profitability when designing products for teens, a 2018 internal email stated that product teams should keep in mind that “The lifetime value of a 13 y/o teen is roughly $270” when making product decisions.


While the company publicly played down its responsibility for contributing to the death of Molly Russell—a British 13-year-old who took her life after consuming what a coroner later concluded was a steady stream of recommended content glorifying self-harm—an internal Meta document cited in the unredacted complaint found a “palpable risk” of “similar incidents” because the company’s algorithmic platform features were “leading users to distressing content.”


Write to Jeff Horwitz at jeff.horwitz@wsj.com

Copyright ©2023 Dow Jones & Company, Inc. All Rights Reserved.

