Can platforms outsmart Texas's social media law?

One idea for how they could try. PLUS: Announcing Platformer's new managing editor, Zoe Schiffer

“A map of Texas full of megaphones, digital art” / DALL-E

Today, let’s talk about the effort to rewrite decades of First Amendment jurisprudence in Texas, in ways that could flood tech platforms with garbage. There’s still hope that the Supreme Court could rule the state’s controversial HB20 unconstitutional — but if it doesn’t, there’s some interesting new thinking around how platforms might be able to get around it anyway.

I’ve written a few times here about HB20, which allows the state’s attorney general and average citizens to sue platforms with more than 50 million monthly US users any time they believe a platform removed a post due to the viewpoint it expresses. In May, the US Supreme Court temporarily blocked the law from taking effect while it was appealed. But just over a week ago, the Fifth Circuit Court of Appeals overturned a lower court’s ruling and allowed the law to take effect. The case is now almost certainly headed for the Supreme Court.

By now there has been a lot of good commentary about the Fifth Circuit’s decision: about its obvious factual errors; its willful misunderstanding of Section 230; and its stated belief that Nazi and terrorism content posted to social networks is a “hypothetical” issue. (Mike Masnick did yeoman’s work on this front in a piece for the Daily Beast.)

And there have been several probing legal analyses, delivered in epic-length Twitter threads, pointing out the various confusions and contradictions within the law and the reasoning of the judges who upheld it: the sudden conservative antipathy toward private property rights, for example, and the potentially unconstitutional nature of the law’s transparency requirements. (One terrible aspect of the Texas law, among many, is that it would prevent platforms from updating their rules more than once every 30 days, even if something horrible starts happening on a platform in the meantime that requires a policy adjustment.)

There have also been some fairly dramatic twists. Texas told the court that the law applied only to Meta, YouTube, and Twitter; in fact, platforms with more than 50 million monthly users in the United States include Wikipedia, Quora, Microsoft Teams, and iMessage. No one really knows what it would mean to force Wikipedia to carry every single political viewpoint; few are excited to find out.

If the Supreme Court upholds the law, it’s unclear how the big tech platforms would comply with it; so far they have mostly declined to comment on the subject. It seems hard to imagine a world where Facebook has to leave up a bunch of pro-Nazi posts for fear of being sued by a random Texas citizen, and yet the law seems to grant it no discretion to do otherwise.

And yet despite the high stakes here, I find myself with relatively little to say on the subject. Not because it isn’t a huge deal — it is — but because none of the analysis or commentary really matters in a world where federal appeals courts are making laws up off the top of their heads. It has been less than 18 months since Justice Clarence Thomas essentially begged states to write laws like the ones passed in Texas and Florida this year; now those laws have landed in his lap, and all that remains is to see whether he can persuade four others that large technology platforms aren’t entitled to editorial discretion or any of the other speech rights that private corporations have long enjoyed.

That said, there are at least two ideas worth considering as we await the outcome of this potential free-speech calamity. One was advanced by the platform regulation scholar Daphne Keller on the first episode of Moderated Content, a new podcast from content moderation expert Evelyn Douek at Stanford. On the show, Keller wonders whether platforms might be able to get around the most vexing parts of the Texas law by allowing users to opt in to content moderation: showing them all the abuse and hate speech and everything else by default, but letting them click a button that restores the community guidelines and the regular platform experience.

“The middle ground I'm most interested in … is to flood Texas with garbage by default, but give users an opt out to go back to the moderated experience,” Keller said. “And there's some language in the statute that kind of, sort of, arguably makes that okay. And it sort of illustrates the problem with the Texas law by flooding everyone with garbage by default, while avoiding a bunch of the bad consequences of actually flooding everybody with garbage permanently.”
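If you want to picture how that might work mechanically, here is a rough sketch (purely hypothetical, with made-up names like buildFeed and optedIntoModeration, and not based on any platform’s actual code): the moderation pipeline still classifies every post, but its verdicts are only applied to the feeds of users who have clicked back into the regular, guideline-enforced experience.

```typescript
// Purely hypothetical sketch of Keller's idea. The names and shapes here are
// illustrative assumptions, not any platform's real data model.

interface Post {
  id: string;
  text: string;
  violatesGuidelines: boolean; // verdict from the platform's existing moderation pipeline
}

interface User {
  id: string;
  subjectToTexasLaw: boolean;
  optedIntoModeration: boolean; // the "restore the regular experience" button
}

// Moderation still runs on every post; the only thing that changes is whether
// its verdicts are applied to this particular user's feed.
function buildFeed(user: User, posts: Post[]): Post[] {
  const applyGuidelines = !user.subjectToTexasLaw || user.optedIntoModeration;
  return applyGuidelines ? posts.filter((p) => !p.violatesGuidelines) : posts;
}
```

Under that reading of the statute, the default in Texas flips to unmoderated, but a user who clicks the button once would presumably never notice the difference.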

I hope it doesn’t come to that. But if platforms really are put in this position, at the very least it seems like an idea worth exploring. Among other things, it would provide some valuable signal about an idea that I think very few people understand: that there is market demand for content moderation, and that a significant majority of users want platforms to remove harassment, abuse, anti-vaxx material, and so on.

Meanwhile, average platform users are thinking through solutions of their own. Over at Techdirt, Masnick has the amazing story of the r/PoliticalHumor subreddit, which now requires users to add the phrase “Greg Abbott is a little piss baby” (a reference to the Texas governor) to every comment or else see that comment deleted. Redditors are also directing users to a page where people can file a complaint with the Texas attorney general, Ken Paxton, “asking him to investigate whether the deletion of any comments that don’t claim that his boss, Governor Greg Abbott, is ‘a little piss baby’ is viewpoint discrimination in violation of the law.”

The idea is to force Paxton to file a lawsuit defending the rule that all comments must call the governor a piss baby, and while I imagine that he will decline to do so, if nothing else the stunt demonstrates the absurdity of passing a law requiring “viewpoint neutrality” in the first place.

When I wrote about HB20 in May, I said that “the future of content moderation feels like it’s all about to come down to a coin flip, and I’m not sure anyone is fully prepared for what could come next.” Now that coin is up in the air, and I’m not feeling any more ready for the consequences than I did four months ago.


Announcing: Zoe Schiffer

Last week I told you that, as part of some changes to Platformer in year three, I hired someone. Today, I’m thrilled to tell you who that person is: Zoe Schiffer, a senior reporter at The Verge, is coming on board as Platformer’s managing editor.

I first began working with Zoe in 2019 when The Verge hired her to help me put together the predecessor to this newsletter each day. Over the next year-plus, we developed an excellent rhythm: Zoe summarized the day’s links before going to work on her own scoops, and I’d use the extra time to put more reporting and analysis into the newsletter. When I left to start this newsletter, losing Zoe as a partner was one of the hardest adjustments I had to make.

In the years since, though, Zoe has demonstrated time and again that she’s one of the most compelling reporters in tech: delivering path-breaking scoops about labor issues inside Apple, Google, and Netflix, among other companies. Recently we collaborated on stories about Twitter’s canceled OnlyFans competitor and discarded plans to combat extremism.

That’s why I’m so excited to team up again. As managing editor, Zoe will have a hybrid editorial and operational role. She’ll be collaborating with me on each day’s newsletter, reporting out stories of her own, and working behind the scenes to grow Platformer’s audience. And while I’ll continue to take the lead on the main column each week, expect to see Zoe’s byline here as well.

Like so much of what I’ve tried in this space over the past two years, hiring a partner is an experiment. My hope is that it will lead to more scoops, sharper analysis, and a bigger audience for what we do. (To that end, I’m also excited to share that Platformer will continue to syndicate a weekly column in The Verge.)

Zoe starts next week. What should we investigate? Reply to this email with your ideas.

And if you haven't yet subscribed, there's never been a better chance to signal your commitment to independent journalism.


Governing


Industry


Those good tweets


Talk to me

Send me tips, comments, questions, and assignments for Zoe and me: casey@platformer.news.