The Facebook whistleblower testifies

What Frances Haugen gets right — and wrong

Facebook whistleblower Frances Haugen testifies during a Senate hearing Tuesday in Washington, DC. (Jabin Botsford-Pool / Getty Images)

Today let’s talk about Facebook whistleblower Frances Haugen’s testimony before the Senate: the good, the bad, and what ought to happen next.

For more than three hours on Tuesday, Haugen addressed a subcommittee of the Senate Commerce Committee. She appeared calm, confident, and in control as she read her opening remarks and fielded questions from both parties. While she brought more nuance to her critique than most Facebook critics — she supports Section 230, for example, and opposes a breakup of the company — she also said the company should declare “moral bankruptcy.”

“This is not simply a matter of certain social media users being angry or unstable, or about one side being radicalized against the other,” Haugen told Congress. “It is Facebook choosing to grow at all costs, becoming an almost trillion dollar company by buying its profits with our safety.”

The Senate largely ate it up. Long frustrated by Facebook’s size and power — and, one suspects, by its own inability to address those issues in any constructive way — senators yielded the floor to Haugen to make her case. Though the hearing was titled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms,” Haugen ultimately walked senators through most of the Wall Street Journal’s Facebook Files, touching on ethnic violence, national security, polarization, and more.

For their part, senators sought to paint the hearing in historic terms, repeatedly comparing Facebook to Big Tobacco and declaring this its “Big Tobacco moment.” “This research is the definition of a bombshell,” said Sen. Richard Blumenthal (D-CT), who led the hearing.

Over at Facebook, the strategic response team lobbed a half-hearted smear at Haugen, noting bizarrely that while at the company she “had no direct reports” and “never attended a decision-point meeting with C-level executives.” If there’s a point in there, I missed it.

Ultimately, Haugen said little on Tuesday that wasn’t already known, either because she had said it on 60 Minutes or because the Journal series had already covered it.

What she may have done, though, is finally galvanize support in Congress for meaningful tech regulation.

Let’s walk through Haugen’s testimony.

The good parts

One, Haugen identified real harms taking place on Facebook services. For example, she talked about documents indicating that using Instagram can contribute to eating disorders in some teenagers. Too often, discussions about the harms of social networks are either abstract or emotional. The primary benefit of Haugen’s leaking is to bring some empirical rigor to those discussions — and to highlight the degree to which these issues are known, but not discussed, by Facebook executives. That’s powerful.

In response, Facebook’s Monika Bickert told CNN that the same research shows that the majority of teenagers find that Instagram improves their well-being. But one of the hearing’s most powerful moments came when Haugen noted that only about 10 percent of cigarette smokers ever get cancer. “So the idea that 20 percent of your users could be facing serious mental health issues, and that's not a problem, is shocking,” she said, citing leaked data.

Two, Haugen highlighted the value of research in understanding problems and crafting solutions. For years now, we’ve watched Congress interrogate Facebook based on spurious anecdotes about who was censored or shadow banned, or what publisher was or wasn’t included on a list of trending topics, to no constructive end.

It was refreshing, then, to see members of Congress wrestling with the company’s own internal data. Sen. Ted Cruz, rarely seen operating in good faith on any subject, largely set aside his questions about censorship to ask Haugen about data exploring the link between Instagram and self-harm. Facebook will say, not unfairly, that senators were cherry-picking with these questions. But we have to ground these discussions in something — why not Facebook’s own research?

Three, and maybe most potently, Haugen helped to shift the discussion of platform problems away from the content those platforms host and toward the design of the systems themselves. “The problems here are about the design of algorithms — of AI,” Haugen said, in response to a question about whether the company should be broken up. A breakup wouldn’t solve anything, she said — the same engagement-based algorithms would likely create similar issues within the new baby Facebooks.

Haugen posited regulation of algorithms — specifically, banning engagement-based ranking of the kind Facebook and Instagram use today — as a way to avoid the First Amendment issues that come with attempting to regulate internet speech. But as the scholar Daphne Keller has written, attempting to regulate ranking algorithms would likely trigger First Amendment scrutiny anyway.

Still, Congress seemed receptive to the idea that it ought to focus on broader system incentives, rather than stunts like the recent efforts in Florida and Texas to force platforms to carry all speech regardless of content. The details get tricky, but that shift would be a welcome one.

The trouble spots

For all its strengths, Haugen’s testimony had some unfortunate aspects as well.

One, Haugen came across as a solutionist: someone who believes that any problem created by tech can also be solved by tech. That tendency showed most strongly in her advocacy for a reverse-chronological feed, which she argues would remove incentives to share polarizing or harmful content.

It seems possible that this is true, but only marginally: polarizing and harmful content was often shared on Twitter and Instagram during the many years those services used reverse-chronological feeds. That’s not to say reducing algorithmic amplification is a bad idea, or that Facebook shouldn’t research the issue further and share what it finds. But given the broad range of harms identified in the Facebook Files, I found it surprising that Haugen’s pet issue is feed ranking: I just don’t believe it’s as powerful as others seem to.
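To make the distinction concrete, here’s a minimal sketch of the two approaches. Everything in it (the Post fields, the scoring weights) is my own illustrative assumption, not Facebook’s actual formula:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int = 0
    comments: int = 0
    reshares: int = 0

def engagement_rank(posts: List[Post]) -> List[Post]:
    # Hypothetical weights: comments and reshares count for more than
    # likes. These numbers are illustrative, not Facebook's real ones.
    def score(p: Post) -> float:
        return p.likes + 5 * p.comments + 30 * p.reshares
    return sorted(posts, key=score, reverse=True)

def chronological_rank(posts: List[Post]) -> List[Post]:
    # Haugen's preferred alternative: newest first, engagement ignored.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

The contrast shows the incentive Haugen objects to: under the first function, a post that provokes a storm of reshares outranks newer, quieter posts no matter what it says. The second function removes that incentive, but, as Twitter’s and Instagram’s own histories suggest, it doesn’t stop harmful content from being posted and shared in the first place.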

My second, somewhat related concern is that Haugen’s testimony had tunnel vision. Those of us who opine about social networks are forever at risk of attempting to solve society-level problems at the level of the feed. To avoid that, we have to bring other subjects into the conversation. Subjects like how the US was growing polarized long before the arrival of social networks. Or the research showing that long-term Fox News viewership tends to shift people’s political opinions more than Facebook usage does. Or the other reasons teenagers may face a growing mental health crisis, from rising inequality and housing insecurity to the threat of climate change.

It’s possible to consider a subject from so many angles that you find yourself paralyzed. But it’s equally paralyzing to begin your effort to rein in Big Tech with the assumption that if you can only “fix” Facebook, you’ll fix society as well.

Finally, Haugen’s testimony focused on the documents rather than on her own work at Facebook. I can’t have been alone in wanting to hear more about her time on the Civic Integrity team, or about her later work in counter-espionage. But senators were more interested in the admittedly fascinating questions raised by the research she leaked.

That’s understandable, but it also meant that Haugen regularly had to remind the subcommittee that it was asking her questions on subjects outside her expertise. In my own conversations with current Facebook employees, this is the point on which I hear the most exasperation: just because you found some documents on a server, they tell me, doesn’t mean you are qualified to describe the underlying research.

There’s an obvious fix for that — summon more qualified employees to testify! But in the meantime, I wish Haugen had taken more opportunities to discuss what she saw and learned firsthand.

What should happen next

Platforms should take the events of the past few weeks as a cue to begin devising ways to regularly share internal research on subjects in the public interest, annotated with relevant context and with data made available to third-party researchers in a privacy-protecting way. Facebook regularly tells us that most of its research shows that people like it, and the company’s market dominance suggests there is probably evidence to back that up. The company should show its hand, if only because soon enough governments will require it to anyway.

Congress should pass a law requiring large platforms to make data available to outside researchers for the study of subjects in the public interest. (Nate Persily argues here that the FTC could oversee such a design.) I think sharing more research is in Facebook’s long-term self-interest, and that the company ought to do so voluntarily. But to get an ecosystem-level view, we need more platforms to participate. Unless we want to rely on whistleblowers and random caches of leaked documents to understand the effects of social networks on society, we should require platforms to make more data available.
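To picture what “privacy-protecting” access might look like in practice, here’s a minimal sketch of an aggregate-only research query with a minimum cohort size, so that no result describes a group small enough to identify individuals. The names and the threshold are my own assumptions; Persily’s clean-room proposal involves far more (vetted researchers, supervised environments, audits) than any single function could capture:

```python
from typing import Callable, Dict, List

MIN_COHORT_SIZE = 100  # suppress results computed over fewer users

def aggregate_query(
    records: List[Dict],
    group_by: str,
    metric: Callable[[Dict], float],
) -> Dict[str, float]:
    # Researchers never see row-level user data: only per-group
    # averages, and only for groups too large to single anyone out.
    groups: Dict[str, List[float]] = {}
    for record in records:
        groups.setdefault(record[group_by], []).append(metric(record))
    return {
        group: sum(values) / len(values)
        for group, values in groups.items()
        if len(values) >= MIN_COHORT_SIZE
    }
```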

What Congress should not do is pass a sweeping law intended to solve every problem hinted at in Haugen’s testimony in one fell swoop. Doing so would almost certainly curtail free expression dramatically, in ways that would likely benefit incumbent politicians at the expense of challengers and marginalized people. Too many of the bills introduced on these subjects this year fail to take that into account. (Unless they are taking it into account, and quashing dissent is their ulterior motive.)

Instead, I’d like to see Congress do a better job of naming the actual problem it’s trying to solve. Listening to the hearing today, you heard a lot of possibilities: Facebook is too big. Facebook is too polarizing. Facebook doesn’t spend enough on safety. Facebook is a national security risk. There still appears to be no consensus on how to prioritize any of that, and it’s fair to wonder whether that’s one reason Congress has had so much trouble advancing any legislation.

In the meantime, right or wrong, Haugen appears to have persuaded Congress that Facebook is as bad as they feared, and that the company’s own research proves it. Simplistic though it may be, that narrative — Facebook is bad, a whistleblower proved it — is quickly hardening into concrete on Capitol Hill.

The question, as ever, is whether our decaying Congress will muster the will to do anything about it.

Elsewhere: Congress threatens to summon Mark Zuckerberg, too. Teenage girls say Facebook’s internal research is right about the pressures that Instagram creates. The limits of the Big Tobacco comparison. What one of Haugen’s SEC complaints has to say about QAnon.


Governing

Snapchat rolled out a tool for helping young adults run for public office. Just a nice little pro-democracy idea here. Here’s Aisha Malik at TechCrunch:

The tool includes a centralized portal that curates over 75,000 upcoming elections on the federal, state and local levels that users may be eligible to take part in. Users will also be asked to identify issues that they are passionate about so the tool can surface roles that they may be interested in. The new tool also provides access to candidate recruitment organizations and training programs such as Emerge America, Women’s Public Leadership Network, LGBTQ Victory, Vote Run Lead and more.

Snapchat users will also be able to nominate their friends to run for office through the tool. Users will be able to share stickers through the tool to start campaigning on Snapchat with their friends. Further, the tool includes a personalized campaign hub that shows users the steps they need to take to get on the ballot. The hub also provides information about filing deadlines and signature requirements.

A look at how Facebook could offer access to outside researchers, overseen by the Federal Trade Commission. The author proposes data “clean rooms” inside company offices that would allow researchers to explore user data in limited ways. (Nathaniel Persily / Washington Post)

Russia threatened to fine Facebook after saying it had not removed “illegal” content quickly enough. The posts may have included “calls for minors to participate in anti-government protests.” (Reuters)

An antitrust case against Apple moved forward in China. As in other countries, the plaintiff here is seeking to force Apple to accept payments outside the App Store. (Josh Ye / South China Morning Post)

Singapore passed significant new restrictions on online discussion in a bid to further crush public debate. The law “would allow authorities to compel Internet service providers and social media platforms to provide user information, block content and remove applications used to spread content they deem hostile.” (Rachel Pannett / Washington Post)


Industry

Sky Mavis, developer of an increasingly popular, cryptocurrency-based video game called Axie Infinity, raised at a valuation near $3 billion. Andreessen Horowitz led the $150 million round. Here are Kate Clark and Hannah Miller at The Information:

The valuation for the three-year-old startup reflects the tremendous growth in sales on Axie Infinity, an Internet-based multiplayer game that uses non-fungible tokens, the one-of-a-kind digital assets authenticated on the blockchain. The game is on track to generate $1 billion in revenue from in-game transactions this year, about 17% of which Sky Mavis keeps, the developer’s co-founder earlier told The Information. […]

Similar to Nintendo’s Pokémon, users of Axie Infinity battle, trade and breed cartoon pets called Axies. Unlike with Pokémon, however, the pets are unique, blockchain-based NFTs. Players can also earn crypto tokens that are customized to the game and then exchange them for other currencies.

A look at the effect of WhatsApp going down in countries around the world where it is a primary mode of communication. Among other effects, downloads of Signal soared. (Amy Cheng / Washington Post)

Related: Telegram saw 70 million downloads yesterday, its founder said. (Pavel Durov)

Plus: Facebook offers a detailed account of what went wrong.

Mark Zuckerberg’s personal wealth fell $6 billion in the hours after Monday’s Facebook outage. It’s part of a nearly $19 billion slide in the past few weeks. (Scott Carpenter / Bloomberg)

Instagram began to unwind IGTV. After three years of false starts, Instagram is bringing longform video to the main feed and eliminating the IGTV brand. (Jacob Kastrenakes / The Verge)

Google’s AI division, DeepMind, turned a profit for the first time. “The £46m pre-tax profit comes after years of mounting losses since Google acquired DeepMind in 2014 for about £400m, and a further £1.1bn debt that parent company Alphabet wrote off in 2019.” (Madhumita Murgia / Financial Times)

Tinder is testing an in-app currency called “coins” in Australia. The idea is to incentivize activity within the app, but I worry it adds a dystopian element to the already fraught realm of online dating. (Michael Tobin / Bloomberg)

Unknown brands and phony reviews are making it increasingly difficult to shop online. “Even with the best research, there’s often no clear answer to the question, what kind of product will I get?” (Heather Kelly / Washington Post)


Talk to me

Send me tips, comments, questions, and Congressional testimony: casey@platformer.news.