Twitter's other porn problem

Imagine the company did want to monetize adult content. Would Apple let it?

(Jeremy Bezanger / Unsplash)

Programming note: I think I’m gonna have one more scoop for you this week, so I’m planning to take Thursday off the newsletter and hit you with a bonus edition on Friday.

I.

Yesterday Zoe Schiffer and I reported on Twitter’s delayed plans to roll out a competitor to OnlyFans, letting the platform’s many adult creators sell subscriptions to their work while taking a cut of the revenue. The company ultimately paused the project over problems identified by a red team it had convened: namely, it found, “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale.”

Today, let’s talk about some of what we heard after we published.

One, the red team rather conspicuously left out of its report an issue that had been raised to management as it considered the Adult Content Monetization (ACM) project, a former employee told me. The issue: that Twitter’s move would draw Apple’s attention to the significant amount of porn on the service, and potentially prompt Apple to place new restrictions on the iOS app or even remove it from the App Store.

These are not merely theoretical concerns. Last year Tumblr had to begin censoring a long list of hashtags that users add to posts, including “submission,” “girl,” and “sad,” amid concerns from Apple that those tags could lead users to “sensitive content.” Critics at the time called the list over-broad to the point of absurdity, but Tumblr had little choice but to comply. Its app was temporarily removed from the App Store in 2018 after child sexual abuse material (CSAM) was found on the platform.

A month later, Tumblr banned adult content altogether.

That puts the stakes of Twitter’s ACM project into stark relief. Researchers at the company say Twitter is already struggling to remove existing CSAM from the platform — something that could trigger an existential ban from the App Store.

But the bigger question might be why Twitter is allowed in the App Store at all, given that the adult content on the app would seem to violate Apple’s existing policies. App Store guideline 1.1.4 bans “overtly sexual or pornographic material, defined as ‘explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.’”

And there’s plenty of that on Twitter.

Of course, porn remains a minority use case for Twitter. (The core use case remains insulting people who have more followers than you.) And Twitter is rated 17+ in the App Store, in an effort to keep kids away. But porn is very much a fact of life on Twitter, and I think Twitter would absolutely risk getting booted from the App Store should it pursue ACM in the future.