How the Kids Online Safety Act puts us all at risk

Anti-speech laws are spreading from states to Congress, and the future of online speech could hang in the balance

Sen. Marsha Blackburn (R-TN), a leading co-sponsor of the Kids Online Safety Act, speaks at a news conference at the U.S. Capitol Building last month. (Anna Moneymaker / Getty Images)

Today let’s talk about a right we currently take for granted that may soon disappear: the ability to browse and post to most websites anonymously.

In May, Utah became the second state to enact a law requiring adult websites to verify the identities of their users. The law requires websites that host porn to use one of several methods to ensure visitors are 18 or older, including state-issued digital IDs and third-party age verification services. Those services rely on a range of tools, such as age estimation based on the user’s face or an uploaded government ID.

The law originated from reasonable concerns that most websites do almost nothing to prevent minors from accessing porn online. But as written, it also discourages adults from using these sites — the stigma around watching porn means that very few people want their face or government name linked to their browsing on a site like Pornhub. (The fact that verification companies are supposed to delete user data after verification hasn’t provided much comfort, since users have no way of knowing whether those companies actually follow through.)

To the anti-porn Republican supermajority of the Utah legislature, this broader challenge to adult websites is undoubtedly a happy byproduct of the law. So, too, is the fact that one of the leading adult websites, Pornhub, went dark in Utah in protest of the new law.

This isn’t the first time the US government has sought to restrict access to adult content by requiring websites to verify users’ ages. In 1998, the Child Online Protection Act attempted a similar age verification scheme. But lower courts struck the law down as unconstitutional, and after 10 years of litigation, the Supreme Court refused to hear the government’s final appeal, letting those rulings stand.

The reason is that the First Amendment sharply limits the government’s power to decide what kinds of content might be “harmful to minors.” In particular, courts observed that materials that might harm a young child would not harm a 17-year-old. The one-size-fits-all strategy, the courts decided, chilled too much speech.

That outcome should cheer anyone who hopes that the Utah law and others like it are eventually thrown out. But this week, privacy advocates lost their first challenge to the Utah law in district court — and for a disturbing reason.

It turns out that Utah crafted its law using lessons from recent Republican victories in abortion cases. Instead of compelling the state to act against porn providers, Utah’s law empowers private citizens to sue them. The result is what the director of the Free Speech Coalition, an advocacy group, has called “an end run around the First Amendment.”

And for now, it’s working.

Here’s Sam Metz in the Associated Press:

U.S. District Court Judge Ted Stewart did not address the group’s arguments that the law unfairly discriminates against certain kinds of speech, violates the First Amendment rights of porn providers and intrudes on the privacy of individuals who want to view sexually explicit materials.

Dismissing their lawsuit on Tuesday, he instead said they couldn’t sue Utah officials because of how the law calls for age verification to be enforced. The law doesn’t direct the state to pursue or prosecute adult websites and instead gives Utah residents the power to sue them and collect damages if they don’t take precautions to verify their users’ ages.

If this strategy sounds familiar, it’s because it’s the same one Texas used in its 2021 law to prohibit abortions at around six weeks. The Supreme Court declined to block that law, allowing its private-enforcement mechanism to stand and making it a model for state legislatures seeking to curtail other freedoms.

And as anti-porn legislation gains ground, lawmakers are turning their attention to the broader web. Earlier this year, Utah’s legislature passed the Utah Social Media Regulation Acts, which require everyone to verify their age before creating an account on a social network. Minors under 18 need parental permission to use social media, and their parents must be granted full access to their accounts.

That’s a blow to the autonomy and privacy rights of older teens. It also puts LGBT minors at risk: they were once able to use social networks to connect with other queer youth in the face of disapproval or abuse from their parents, and they’ll now have a much harder time doing so.

But it’s not just teens who will suffer. Adults who are forced to upload their government identification to an endless string of websites will almost certainly one day find themselves victims of a data breach in which hackers obtain that data and sell it on the open market.

And it’s not just Utah, either. The revived Kids Online Safety Act (KOSA) would borrow some of the same ideas from state legislation and take them national.

Chief among these ideas is that social platforms must soon take additional steps to protect minors — a requirement that will likely force them to undertake age verification measures similar to Utah’s.

It’s true that after significant pushback, Congress watered down the bill’s verification provisions. But as Mike Masnick notes at Techdirt, the law is written vaguely enough that platforms will likely feel compelled to adopt verification schemes anyway. (Techdirt has covered KOSA in great detail since it was introduced, and its coverage is worth perusing.)

And what must platforms do after verifying their users’ ages? KOSA attempts to create a “duty of care,” holding companies responsible for harms that minors may experience after encountering speech on their platforms.

“This duty of care directly requires platforms to protect against the harmful effects of speech, the overwhelming majority of which is constitutionally protected,” wrote Ari Cohn, a First Amendment lawyer at the think tank TechFreedom, in a letter to the bill’s Senate supporters.

Cohn goes on to explain why we can’t put platforms in charge of determining what might harm a minor:

Platforms cannot “prevent and mitigate” the complex psychological issues that arise from circumstances across an individual’s entire life, which may manifest in their online activity. These circumstances mean that material harmful to one minor may be helpful or even lifesaving to another, particularly when it concerns eating disorders, self-harm, drug use, and bullying. Minors are individuals, with differing needs, emotions, and predispositions. Yet KOSA would require platforms to undertake an unworkable one-size-fits-all approach to deeply personal issues, thus ultimately serving the best interests of no minors.

Again, courts have routinely refused to impose such vague, expansive, and inherently unmeetable duties of care on disseminators of expression, holding that doing so would unacceptably chill First Amendment activity. Indeed, because platforms cannot reasonably anticipate or individually account for the different ways each minor will experience, interact with, or be impacted by content, KOSA’s duty of care “provide[s] no recognizable standard . . . to follow.”

Given how differently individuals respond to different kinds of content, it’s not clear to me how a duty of care would make kids safer. But it seems fairly obvious how it could put them at risk.

The reason: Another plank of the bill would allow state attorneys general to sue platforms if they believe that minors have been harmed. Right-wing groups are already agitating to use this provision to sue platforms for showing minors content involving transgender people. Should KOSA become law, I imagine it will be mere weeks before the attorney general of Texas, Florida, or another red state sues Meta, Google, and other tech giants simply for making trans content available to minors.

Making good internet law requires carefully balancing equities. And I’m sympathetic to the idea that platforms ought to do more to protect young people, particularly around content that may negatively affect their mental health.

But the equities here are way off kilter. Lawmakers are quickly advancing an anti-sex, anti-speech agenda in which every adult user of the internet could soon find themselves entangled.

A decade ago, the Supreme Court could mostly be assumed to keep laws like this in check. But the Roberts Court shows increasingly little regard for precedent — and some justices are openly craving retribution against the platforms on speech-related issues.

Over the past half decade, Congress has failed to pass far better pieces of platform regulation. It would be a shame if lawmakers’ first big swing at improving internet safety in years came at the expense of the speech rights of every single person using a platform.


On the podcast this week: By popular demand, we walk through the science of room-temperature superconductors and what they might mean for future technology. Then, Kevin and I debate the finer points of KOSA. And finally, a round of HatGPT.

Apple | Spotify | Stitcher | Amazon | Google



Those good tweets

For more good tweets every day, follow Casey’s Instagram stories.



Talk to us

Send us tips, comments, questions, and sensible internet rules: casey@platformer.news and zoe@platformer.news.