Why Facebook should release the Facebook Files
The company's least-bad option might also do a lot of good
Programming note: Platformer will be off on Tuesday as I attend the Code Conference in LA and host a panel about information disorder. I hope to bring you back at least one exquisite piece of gossip.
The fallout from the Wall Street Journal’s Facebook Files series continues. On Sunday the company published a point-by-point rebuttal to the Journal’s story on Instagram’s effects on teenage girls — and then on Monday morning, the company said it would “pause” plans to build Instagram Kids while it consults with more outside groups. A Senate hearing looms on Thursday.
I spent the weekend talking to people in and around Facebook about the situation, and today I want to talk about what I think the company ought to do.
Inside Facebook, some people I’ve spoken with are feeling exasperated. They argue that the Journal series uses relatively few data points to paint Facebook in the worst possible light. To them, it’s more evidence of bias from a press working to bring the company to its knees, reaching predetermined conclusions with whatever scraps of information they can find.
For others, though, particularly those who have worked on research and integrity initiatives, the Facebook Files have been a welcome opportunity to discuss their greatest fear: that despite its researchers’ most worrisome findings, Facebook lacks the organizational structure and leadership necessary to prevent a wide range of avoidable harms.
Last week I said this situation represents Facebook’s most significant challenge since the Cambridge Analytica data privacy scandal. It’s not as big as Cambridge Analytica; the Journal series has gotten less coverage overall. (Though that Senate hearing means the balance will continue to shift.) But if another story has generated a news cycle this intense or sustained since 2018, it’s news to me.
In the internal divisions over the Facebook Files, though, I find another echo of Cambridge Analytica. Then, too, there was a set of executives determined to fight back against what they perceived as an almost entirely bogus narrative — and another set that, while mostly in agreement with their peers, understood that the story had raised real fears about the company’s power and influence that would have to be addressed.
Last week I argued here that Facebook ought to address this situation by committing to doing more research like that found in the Facebook Files, rather than less. We know Facebook executives believe that the company has positive overall benefits for the world, and we also know that they are meticulous students of their own data. It’s hard to understand why, if the data is so positive, Facebook is often so reluctant to share it.
So why is that the case? One possibility is suggested by the Facebook Files: that the data about Facebook’s effects on societal issues like polarization, vaccine hesitancy, and children’s self-esteem are substantially negative, and must therefore be hidden. Another is that the data is substantially positive but must be hidden anyway, whether out of run-of-the-mill corporate secrecy or a desire to deploy it more strategically for public-relations purposes.
Whatever the case, it seems clear that the current state of affairs is making everyone miserable. So today I want to expand my argument: Not only should Facebook commit to doing more research like that found in the Facebook Files, it should release the Facebook Files, period. And not just the Instagram-related ones, as Nick Clegg suggested Monday. Whatever documents the Journal relied on, Facebook should make them publicly available. Redact them as needed to protect users’ privacy. Add context where context is missing.
But release them, and soon.
Here’s my rationale.
One, the files are in the public interest. Among other things, according to the Journal, they contain discussions of political parties that changed their policies based on changes to Facebook’s algorithms, document negative effects of using Instagram on mental health, and reveal that the company devotes vastly more moderation resources to the United States than to the rest of the world. On these subjects and more, the public simply has a right to know what Facebook does. One frustration I’ve had over the past week is that Facebook continues to focus on the public-relations dimension of the story, when the public interest is much more important.
Two, the files will likely come out soon anyway: the whistleblower who leaked them to the Journal is apparently cooperating with Congress. Copies were shown in advance of publication to various researchers. The Journal may yet release them itself. (I wish it would.) In any case, it seems likely that they are going to be available for all of us to read soon. Facebook could generate some (admittedly minor) amount of goodwill by doing it voluntarily. (Company spokesman Andy Stone told me the company is sharing the decks with Congress this week.)
Three, Facebook’s primary complaint about the series is that reporters allegedly took key points out of context. The only way to credibly make that charge is to provide people with the full context. It’s not enough for the company’s head of research to describe one set of slides; to have an honest conversation about all this, we should all be looking at the same set of documents. If, as Facebook says, the majority of the research shows benign or even positive effects, it should have all the more reason to want us to read them.
To be sure, the people inside Facebook arguing against the documents’ release have compelling points on their side, too. As soon as the files are made public, every tech reporter on earth will scour them in an effort to find angles that the Journal missed, extending the life of the story and perhaps even worsening the damage. Even if there are positive angles to be found within the data, there’s no guarantee that reporters will actually write them. And a narrow-minded focus on these documents crowds out a larger and equally important discussion of why we aren’t demanding similar research out of YouTube, Twitter, TikTok, and all the rest.
Moreover, the company was taken aback by the largely negative response that its Sunday night blog post received, I’m told. (I was one of the people responding negatively. So was Samidh Chakrabarti, Facebook’s just-departed head of civic integrity, who pointed out that the blog post would have been more credible had it been signed by the researchers who actually did the analysis.)
The blog post by Pratiti Raychoudhury, the company’s head of research, is detailed and thoughtful in the way that it reflects on both the good and bad news in the company’s studies on how young Instagram users feel about themselves after using the app. The data is mixed, and people will draw different conclusions from it. The fact that so many critics dismissed her report out of hand, though, may have made the company reluctant to share more. If this is the response we get, the argument goes, what’s the point?
But none of these complaints is more important than the fact that sharing this data with the public is ultimately the right thing to do. And it will be better for Facebook to share it on its own terms than on Congress’.
And if Facebook really wanted to change perception, it could go a step further. Releasing the Facebook Files quickly is the company’s least-bad option. But the company knows that outside researchers will be skeptical of any findings they contain, because they can’t see the raw data. Even to the extent that the files exonerate Facebook from some criticisms, the underlying data is likely to remain under a cloud of suspicion.
That’s why, in addition to making the files public, Facebook should share the underlying data with qualified independent researchers in a privacy-preserving way. Let’s get a second, third, and fourth opinion of what the data shows about Instagram and teenagers. Given the recent revelation that political-science data shared with researchers in 2020 was fatally flawed due to a bug, an unexpected gift of important new research material could help the company rebuild trust with researchers.
Not everyone thinks this would be much of a gift: anyone can survey teens about their experiences on Instagram, after all, and among other things an independent study could recruit a larger sample. But to the extent that data in the Facebook Files can’t be easily accessed or replicated by independent researchers, Facebook should share as much as it can. The company’s efforts to share data with researchers to date have been halting and ineffectual. More transparency is coming to the platform one way or another; there’s still value in staking out a leadership position while the rest of the industry cowers.
I say release the Facebook Files for short-term goodwill, and release at least some of the data to qualified researchers for long-term credibility. Since it was founded, Facebook has relentlessly analyzed our actions and behavior, to its great benefit. However unjust it may feel today, it’s only fair that the company now take its turn under the microscope.
⭐ Our constitutional crisis is already here. In a chilling, must-read essay, Robert Kagan outlines very real scenarios in which our democracy could collapse in 2024. Read the whole thing:
The result is that even these anti-Trump Republicans are enabling the insurrection. Revolutionary movements usually operate outside a society’s power structures. But the Trump movement also enjoys unprecedented influence within those structures. It dominates the coverage on several cable news networks, numerous conservative magazines, hundreds of talk radio stations and all kinds of online platforms. It has access to financing from rich individuals and the Republican National Committee’s donor pool. And, not least, it controls one of the country’s two national parties. All that is reason enough to expect another challenge, for what movement would fail to take advantage of such favorable circumstances to make a play for power?
Google appealed the $5 billion fine levied against it by the European Union in an antitrust case. A verdict won’t arrive for months. (Sam Schechner and Daniel Michaels / Wall Street Journal)
Apple and Google bowed to Russia’s demands to remove a voter guide from their app stores, under new laws that enable the country to physically threaten employees. The use of old-fashioned thug tactics heralds a dark new turn in platform governance; I hope this issue gets more attention. (Justin Sherman / Wired)
Most voters want to curb the power of big tech, according to a new poll. “80% of registered voters — 83% of Democrats and 78% of Republicans — agreed the federal government ‘needs to do everything it can to curb the influence of big tech companies that have grown too powerful and now use our data to reach too far into our lives.’” (John D. McKinnon / Wall Street Journal)
A look at the radicalizing role Telegram played in the run-up to the recent German election. The app “is fast becoming a hotbed for far-right voters, people peddling conspiracy theories and extremists promoting hate speech,” critics say. (Mark Scott / Politico)
The prime minister of Cambodia is Zoom-bombing opposition meetings. In retrospect it’s surprising that Trump never tried this. (Bopha Phorn and Shaun Turton / Rest of World)
⭐ Facebook is spending $50 million to build a more responsible metaverse. The XR Programs and Research Fund will support partnerships with Women in Immersive Tech, Africa No Filter, Electric South, and the Organization of American States, among others. Here’s Mitchell Clark at The Verge:
The announcement also gives us Facebook’s definition of the sometimes nebulous word “metaverse.” The company describes it as “virtual spaces where you can create and explore with other people” that you’re not physically with, spread out over a variety of products and services. Facebook says the fund’s goal is to make sure it builds its part of the metaverse with an eye towards compatibility with other services, as well as inclusivity, privacy, safety, and “economic opportunity.” Right now, Facebook’s biggest metaverse program is a platform called Horizon, which exists as a beta Oculus app that lets people have VR meetings.
Related: A look at the “metaverse” as a PR and lobbying campaign. (Elizabeth Dwoskin, Cat Zakrzewski and Nick Miroff / Washington Post)
TikTok hit 1 billion monthly active users. A huge milestone for an app that became ubiquitous seemingly overnight. (Todd Spangler / Variety)
Google slashed the amount of revenue it collects from sales in its cloud marketplace from 20 percent to 3 percent. The power of competition! (Jordan Novet / CNBC)
Google is negotiating to bring Instagram and TikTok videos to search results. The apps compete with Google in some ways, though, potentially making a deal difficult to reach. (Sarah Krouse and Kaya Yurieff / The Information)
Snap is offering new filters that teach you how to finger-spell in American Sign Language. Neat! (Kim Lyons / The Verge)
Those good tweets
Talk to me
Send me tips, comments, questions, and Facebook Files: email@example.com.
By your measure, the 'bigness' of a scandal is judged by how much coverage there is. My assertion would be that the amount of coverage is immaterial to the truth of what actually occurred. Just because something is framed – in this case, literally branded – as a big scandal, doesn't mean it actually is. (The preemptive designation "The Facebook Files" is our first clue of melodrama...)
So in this way, I think your comparison to Cambridge Analytica is quite apt – as measured by media coverage, that was a huge scandal. But in terms of actual impact? As you mentioned last time, by now it's well-understood that most of what was reported was vastly overblown. *Vastly*.
But your average person doesn't know it was overblown! As a result of the endless coverage, people came to believe that Facebook got Donald Trump elected due to Russian meddling. And that their data had been hacked, sold and stolen. This is the ironic thing about the CA scandal: the entire foundation was constructed on the suggestion that Facebook was reshaping the psychology of the nation...when it was that very reporting that (mis)shaped the public's understanding of what actually occurred!
So can we trust those same institutions this time around? It's unclear to me how we could. I'm sure there will be a lot of coverage, but I don't think we should take that as any sort of signal.