But that picture is at odds with the departures of people who talk about having blood on their hands.
I spoke last week to a former researcher whose badge post I did not see in the Facebook Papers. She told me she would sit in product meetings and share examples of users she had spoken to, victims of hate speech or harassment. “And there are no women on those product meetings,” she says. “We as researchers in privacy and safety would present these stories that were pretty shocking, like ‘Here’s just one woman I spoke to, and in the course of one day, she got 40 direct messages from people that she didn’t even know and was being harassed.’ But you have to present it with other data, quantitative data. Sometimes that sort of small story gets lost.”
And all too often the problem doesn’t get solved. “If you’re a ‘lowly product manager’ you could be doing the best work in the world, but if you don’t get X number of new users to sign up, you don’t get your bonus, or you don’t get promoted,” she says. To truly address the problems, “the way that the company incentivizes product teams would radically have to change,” she adds.
Another complication: Facebook is structured to resist such change. Making a product shift to improve safety or reduce misinformation in something like the News Feed involves work from several teams, sometimes in the double digits. As one badge poster noted, making an integrity change that improves safety requires approval from multiple departments. But it only takes one “no” to stop that change from happening.
Even worse is the resistance that comes from higher-ups in Facebook’s food chain. “Integrity teams are facing increasing barriers to building safeguards,” a researcher said in a badge post on August 25, 2020. “In recent months, I’ve seen promising interventions from integrity product teams, with strong research and data support, be prematurely stifled or severely constrained by key decision-makers—often based on fears of public and policy stakeholder responses … Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms.”
I’ve spent hundreds of hours in the past few years talking to Facebook employees, including Mark Zuckerberg, and diving into the way the company operates. Nonetheless, I found the Facebook Papers revelatory—not because they contain major surprises about the weaknesses, conflicts, and unacceptable compromises made by Facebook and its leaders, but because they expose how thoroughly aware those leaders were of the platform’s flaws. Over the past few weeks, comparisons between Facebook and Big Tobacco have gained popularity. But Nick Clegg has pushed back on this analogy, and I actually agree with him. There is no mitigating factor in tobacco: No one’s health is improved by cigarettes, and they will kill you. Instead, when I look through these documents—which prove that so many of the terrible things we heard about Facebook were duly reported and documented by its researchers and presented to company leaders—I think of another corporate crisis, one that happened two years before Mark Zuckerberg was born.
Early one morning in September 1982, the parents of 12-year-old Mary Kellerman of the Chicago suburb of Elk Grove Village found their daughter dying on the bathroom floor. Hours earlier, she had complained of a cold, and her parents had given her one capsule of Extra-Strength Tylenol, the nation’s most popular remedy for minor discomfort. Hers was among three poisoning deaths reported that day, and each victim had taken Tylenol capsules laced with cyanide. The death toll would soon reach seven.