On Monday, four days before a Senate Commerce Committee hearing called “Protecting Kids Online: Facebook, Instagram and Mental Health Harms,” Facebook announced that it was pausing development of a new product—an Instagram app aimed at children aged between ten and twelve. (Facebook currently restricts its products to users thirteen and older.) The Times called the company’s decision “to halt the app’s development . . . a rare reversal for Facebook,” but, according to Adam Mosseri, the head of Instagram, Facebook isn’t abandoning the app but working to make it more acceptable to parents and policymakers, and to “demonstrate the value and importance of this project for younger teens online today.” Critics see the move as an effort to appease members of Congress in order to stave off meaningful regulation.
Instagram, Facebook’s popular photo-sharing app, which the company acquired in 2012, for a billion dollars, to shore up its presence on mobile devices, is now valued at around a hundred billion dollars and is central to the company’s continued growth. Yet parents and child-development experts, among others, have long argued that the app, which presents a curated, Photoshopped version of reality, is harmful to young people’s self-esteem, confidence, and over-all mental health. A young person told researchers from the U.K.’s Royal Society for Public Health, for a report published in 2017, “Instagram easily makes girls and women feel as if their bodies aren’t good enough as people add filters and edit their pictures in order for them to look ‘perfect.’ ” Another user reported that “Bullying on Instagram has led me to attempt suicide and also self-harm. Both caused me to experience depressive episodes and anxiety.” According to the Wall Street Journal, more than forty per cent of Instagram users are younger than twenty-three.
Senator Richard Blumenthal, of Connecticut, the chair of the subcommittee on Consumer Protection, Product Safety, and Data Security, and his Republican colleague Senator Marsha Blackburn, of Tennessee, are convening Thursday’s hearing. In August, they asked Mark Zuckerberg, Facebook’s C.E.O., to release the company’s internal research on the mental-health effects of its platforms on young people. Facebook has been conducting such research for several years through focus groups, online surveys, and other large-scale studies, but the company refused the senators’ request. (A Facebook spokesperson said that the research is proprietary and intended “to promote frank and open dialogue” inside the company.) A few weeks later, some of those studies were included in a cache of internal Facebook documents that were leaked to the Wall Street Journal and published as part of a series of articles called “The Facebook Files,” written by a team of reporters led by Jeff Horwitz. As reported by the Journal, the documents show that the company is fully aware that Instagram has deleterious effects on teens. A PowerPoint slide created by Facebook researchers in 2019, for example, states that Instagram makes body-image issues worse for one in three teen-age girls. Another research presentation, from March, 2020, which was published on Facebook’s internal message board and was viewed by the reporters, noted that “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Teens also told Facebook’s researchers that the app contributed to their depression and anxiety, a complaint that a company document from 2019 noted “was unprompted and consistent across all groups.” Young Instagram users also indicated that they felt addicted to the app and lacked the wherewithal to limit their use of it.
In a post on Facebook’s blog responding to the Journal, Pratiti Raychoudhury, the company’s head of research, wrote that the body-image statistic cited was misleading as presented, and that other research shows that Instagram is, in fact, helpful for teens. Zuckerberg made a similar claim during a congressional hearing earlier this year, saying that the app could help kids “stay connected,” and there is evidence that, during the pandemic lockdown, young people did benefit from turning to social media to blunt their isolation. Even so, as Blumenthal told the Journal’s reporters, “Facebook seems to be taking a page from the textbook of Big Tobacco—targeting teens with potentially dangerous products while masking the science.” He and Blackburn scheduled the hearing shortly after the Journal articles were published. Antigone Davis, Facebook’s global head of safety, is slated to testify.
By any measure, even for a company that spends nearly twenty million dollars a year lobbying lawmakers, “The Facebook Files” should be calamitous. The series also reports that a little-known Facebook program called “XCheck,” or “cross check,” which exempts from the company’s content restrictions the high-profile accounts of celebrities, politicians, influencers, and others who might pose a public-relations problem if their posts were taken down, had grown to include close to six million users by 2020. In June, according to the Journal, the company told its Oversight Board that XCheck was used in only “a small number” of content-related decisions. (A spokesperson for the company said that Facebook had made “significant progress in addressing the challenges in the program.”) The series additionally reports that drug cartels and human traffickers flagged by Facebook employees still operate openly on both Facebook and Instagram, despite efforts at mitigation, and that Facebook continues to propagate misinformation and disinformation about COVID-19 vaccines. (On Tuesday, the Times reported that Facebook groups promoting the use of the anti-parasitic drug ivermectin to treat COVID are still generating thousands of interactions every day.) And the series further reports that the company may sometimes enable repressive regimes in fast-growing emerging markets to silence its critics, for fear of losing access to those markets.
In particular, the company curtailed “access to dissident political content deemed illegal in exchange for the Vietnamese government ending its practice of slowing Facebook’s local servers.” The reporters note that, according to a former Facebook employee who worked in Asia, the company acceded to this practice “because Vietnam is a fast-growing advertising market.” (Nick Clegg, the company’s vice-president of global affairs, wrote that the Journal articles “have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”)
In a crucial way, though, Facebook’s accommodation of foreign political leaders is comparable to its plan to create an Instagram app for kids. Facebook executives prioritize growth, and so they need to find new markets, both geographic—Vietnam, for instance—and demographic, such as children under thirteen. In the latest installment of “The Facebook Files,” published on Tuesday, Horwitz and Georgia Wells write that “the company formed a team to study preteens, set a three-year goal to create more products for them, and commissioned strategy papers about the long-term business opportunities presented by these potential users. In one presentation, it wondered whether there might be a way to engage children during play dates. ‘Why do we care about tweens?’ said one document from 2020. ‘They are a valuable but untapped audience.’ ” (The same day, the company said this research was commissioned to create “safer and age-appropriate options for children.”)
When Senators Blumenthal and Blackburn announced Thursday’s hearing, they said that the committee would “use every resource at our disposal to investigate what Facebook knew and when they knew it—including seeking further documents and pursuing witness testimony.” On Tuesday, they confirmed that a whistle-blower who has been aiding the committee’s investigation will testify in Congress on October 5th. It does not help the company that, as Politico reported last week, on August 6th, two groups of Facebook shareholders filed amended complaints against Zuckerberg, Sheryl Sandberg, and other Facebook board members. (The shareholders originally filed suit in 2020, and the new filings include information from documents that they acquired as a result.) Among many claims, one of the suits alleges that the five-billion-dollar fine that the Federal Trade Commission levied against Facebook in 2019, for its role in the Cambridge Analytica data scandal, was, in effect, an overpayment—a quid pro quo to insure that Zuckerberg would not be held personally liable. Five billion dollars is the largest fine ever imposed by the F.T.C. for violating consumer privacy. (At the time of the original filing, a Facebook spokesperson said that the lawsuit was without merit; the company declined to comment this week.)
On Monday, a few hours after Facebook announced that it was pausing work on the Instagram app for kids, Blumenthal and three other Democratic lawmakers, including Senator Ed Markey, of Massachusetts, called on the company to “completely abandon this project.” This was a plea, not a proviso, because, so far, supplication is the only expedient available to them. But perhaps Congress will move beyond the usual performative theatrics of chastising Big Tech executives in the hearing room and begin to craft robust legislation to curb social media’s antisocial effects. The evidence is before it.