Last month, after the Wall Street Journal published a series of articles drawn, in part, from tens of thousands of pages of leaked internal Facebook documents that purportedly show the ways in which the company sometimes prioritizes profit over the public good, and after the Senate Commerce Committee used those documents to grill Facebook’s head of global safety, Antigone Davis, about the company’s knowledge of the harm it causes, one question remained: “What happens now?” Facebook, which said that the Journal series mischaracterized its research and presented it out of context, ignoring the positive aspects of social media, has weathered bad P.R. in the past, and Congress has so far failed to pass significant regulatory legislation, so the likely answer seemed to be “not much.” Then, on Sunday night, on “60 Minutes,” Frances Haugen, a thirty-seven-year-old former Facebook data scientist, came forward not only to acknowledge that she was the whistle-blower who had leaked the documents but also to reveal that they formed the basis of at least eight complaints that she and her lawyers had filed with the Securities and Exchange Commission. In addition, she said, she had shared the documents with members of the European and British parliaments. In a blog post published late Tuesday night, Mark Zuckerberg, the Facebook C.E.O., said that Haugen’s claims “don’t make sense,” adding that “at the heart of these accusations is this idea that we prioritize profit over safety and well-being. That’s just not true.”
Haugen worked for nearly two years at Facebook, where she was the lead product manager on the civic-integrity team, which the company later subsumed into a larger department. Before she decided to comb through Facebook’s research and message boards looking for evidence that the company was aware of its toxic impacts, she consulted with her mother, an Episcopal priest in Durant, Iowa. Haugen considers herself to be a rule-follower, she told the Journal. But her mother told her that, if she believed Facebook was putting lives in danger, she should do what she could to save those lives. So she chose to gather and then share evidence of, among other things, the ill effects that Instagram, the photo-sharing app that Facebook acquired in 2012, can have on teen-agers’ emotional and physical health, and the ways in which Facebook’s algorithms can promote extremism and foster internecine conflict.
Then, on Tuesday, Haugen, a self-possessed Midwesterner, sat for hours in a Senate hearing room, testifying before the Subcommittee on Consumer Protection, Product Safety, and Data Security. There she repeated, with meticulous candor, what she had learned during her time at Facebook and from the internal documents that she’d appropriated last spring, before she left the company. “I came to realize the devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook,” she told the senators. “The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.” She added, “Facebook wants you to believe that the problems we are talking about are unsolvable. They want you to believe in false choices. They want you to believe that you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon, free speech. . . . That to be able to share fun photos of your kids with old friends you must also be inundated with anger-driven virality. They want you to believe that this is just part of the deal. I am here today to tell you that that’s not true.”
Much of that virality, Haugen said, is caused by Facebook’s amplification algorithms, which the company’s “growth division” constantly tweaks to make content circulating on the platform more “sticky.” The stickier the content, the more time users spend on the platform, and the more money Facebook makes. The operation rests on a simple equation: more time equals more ads equals more revenue. Zuckerberg, in his blog post Tuesday night, wrote, “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.” The company also claims that it has made changes to prioritize “meaningful social interactions” between its users over viral video content. But, as Haugen told it, the company knows that extreme content generates the most user engagement, so the platform is built to promote it by rewarding the post (and its poster) with likes, shares, and comments. Those, in turn, send little hits of dopamine to the original poster, encouraging the creation of ever more extreme posts—call it a vicious cycle of viciousness. Facebook knows this from its own research, Haugen testified, and has no incentive to change it.
In her S.E.C. complaint, Haugen highlights a Facebook study called “Carol’s Journey to QAnon,” which found that when a hypothetical user named Carol “set out to follow conservative political news and humor content generally . . . page recommendations began to include conspiracy recommendations after only 2 days.” It took less than a week for the algorithm to offer up QAnon content. In a long internal memo to Facebook employees, sent prior to Haugen’s testimony, Nick Clegg, the former British Deputy Prime Minister who is now Facebook’s vice-president of global affairs and communications, wrote that, “if it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.” (Still, according to a May, 2020, article in the Wall Street Journal, a slide from a 2018 Facebook presentation reads, “Our algorithms exploit the human brain’s attraction to divisiveness.” If that process is left unchecked, the slide states, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”)
Tuesday’s hearing, titled “Protecting Kids Online: Testimony from a Facebook Whistleblower,” was convened, primarily, to discuss the risks that Facebook and especially Instagram pose to young people. Senator Marsha Blackburn, in a statement on the Senate floor, observed that Facebook’s own research indicates that sixty-six per cent of teen girls and forty per cent of teen boys experience “negative social comparisons”—measuring their appearance against celebrities’ and other influencers’—when they use Instagram, which can contribute to a cascade of mental-health issues, including eating disorders and suicidal thoughts. The research also indicates that, even as Instagram makes teen-agers unhappy or ill, they find it difficult to stop using the app. What others call “addiction,” Haugen said, Facebook calls “problematic use.” She also noted that adolescents often make secret Instagram accounts, which they hide from their parents and which benefit the company by boosting its advertising revenue. Senator Richard Blumenthal brought up these secret accounts, sometimes called “finstas,” both at Tuesday’s hearing and during an exchange with Davis last week. He stated, “In multiple documents, Facebook describes these secret accounts as ‘a unique value proposition.’ It’s a growth strategy, a way to boost its monthly active-user metric.” Davis responded that “finstas are not something that we actually built.” She added, about users, “They built—they did it to actually provide themselves with a more private experience.” (What Blumenthal appeared to be referring to are so-called SUMAs, meaning “same user with multiple accounts.” A Facebook report referenced in Haugen’s S.E.C. complaint notes that “over 15% of new teen accounts are existing users creating a SUMA child account.”)
Haugen is not the first former Facebook employee to warn of the ways in which the platform is used to stoke social divisions, create conflict, and incite violence. Just last year, another former Facebook data scientist, Sophie Zhang, wrote a nearly eight-thousand-word memo as she was leaving the company, alleging that the platform enables politicians, heads of state, and political parties to create fake accounts used to harass their opposition and influence elections. Yael Eisenstat, a former C.I.A. officer and a national-security adviser to then-Vice-President Joe Biden, has recounted in TED Talks and elsewhere that she left Facebook, in 2018, after only a few months, once it became clear to her that the company’s leadership had little interest in letting her do the job that she had been hired to do: combat misinformation in political ads as the head of global elections-integrity operations. In response to Zhang’s memo, Facebook said, “We fundamentally disagree with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform. As part of our crackdown against this kind of abuse, we have specialized teams focused on this work and have already taken down more than 150 networks of coordinated inauthentic behavior.” The company has never directly commented on Eisenstat’s criticisms, but, as a policy, it does not fact-check political ads.
Haugen’s emergence coincides with other events in a negative-news cycle that appears to have Facebook scrambling to keep up. In last week’s Senate hearing, Davis ended up trying to discount Facebook’s own research, saying, “I want to be clear that this research is not a bombshell. It’s not causal research. It is, in fact, just directional research that we use for product teams.” She was interrupted by Blumenthal, who insisted that it was, in fact, “a bombshell.” On Monday, in between Haugen’s appearances on “60 Minutes” and in the Senate, Facebook filed its response to a Federal Trade Commission lawsuit that seeks to use antitrust law to break up the tech giant. Though Facebook won the initial round, in June, when the district-court judge James Boasberg ruled that the F.T.C. had not sufficiently proved that the company is a monopoly, the agency’s amended complaint is more substantial. The antitrust argument may have gotten a boost on Monday, when the entire Facebook ecosystem suffered a worldwide outage that demonstrated how entrenched the company is in the marketplace. The same day, Ashkan Soltani, a leading expert on privacy law and the former chief technologist at the F.T.C., was appointed to head the new California Privacy Protection Agency, where he will be in charge of enforcing the California Privacy Rights Act, which is regarded as the strongest privacy-protection law in the country. On Tuesday, about an hour into Haugen’s Senate appearance, Facebook’s policy communications director, Andy Stone, in an apparent effort to blunt her testimony, took to Twitter, writing that she was talking about research that she had not conducted herself and had “no direct knowledge of.” In her testimony, Haugen acknowledged that she didn’t work directly on the Instagram research and was relying on the analyses of those who did.