Congress reads the Facebook Files

And the company turns against its researchers

Sen. Richard Blumenthal (D-CT) delivers opening remarks Thursday during a hearing on Facebook and children's online safety in Washington, DC. (Tom Brenner / Getty Images)

On Thursday morning, we witnessed the rarest sort of Congressional hearing about social media: a productive one. In just over two hours, the Senate Commerce Committee quizzed Facebook’s head of safety, Antigone Davis, about recent reporting in the Wall Street Journal about internal research on Instagram’s effects on teenagers. The cumulative impression given by members’ questions, and Davis’ answers, is of a company losing its grip on the narrative.

The old story was: social networks can have both positive and negative effects, but at Facebook on balance the effects are mostly positive, and the company is working hard to reduce any harms it finds.

The new story is: Facebook knew much more about these negative effects than it ever disclosed, it ignored or undermined the researchers who discovered them, and its apps now appear to be contributing to an array of mental health issues, particularly for younger users.

That new story has now been told in the Journal, in the internal research that the paper published yesterday, in the thousands of documents a whistleblower shared with Congress, and in the questions and statements senators made on Thursday.

“This conclusion is not solely one report, or one Facebook employee’s perspective,” Sen. Richard Blumenthal (D-CT) said at Thursday’s hearing. (TechCrunch has a nice blow-by-blow recap.) “It is a pattern of findings repeated across sophisticated and extensive studies that Facebook itself conducted over the past four years.”

The hearing was productive in large part because it remained focused on those findings — which, again, were Facebook’s own.

Before all this, it was easier to take a more muddled view of the effects of social networks on human behavior. The internal and external research has in fact been mixed. Causal relationships are difficult to determine, even for an organization with Facebook’s data and financial resources. There’s something suspiciously deterministic about a statement like “Instagram makes us depressed.”

It’s a subject almost everyone has an opinion about. But very few have evidence for their beliefs.

This is why the leaked research poses such a significant threat to the company. It reveals the extent to which concerns about the apps’ effects on mental health were shared by employees — and the extent to which those concerns were not taken seriously.
