A visit to TikTok's transparency center

Few companies have ever made their code so available for inspection. How much will it matter?

(Olivier Bergeron / Unsplash)

I.

For all the ways TikTok is different from the other big social apps — cultural dominance, Chinese ownership, rippling bans around the world — its offices turn out to look like any other tech platform’s.

It has the same heavy security presence, the same badged doorways, the same iPads stationed at the entry for visitors to sign in. As you would find at Facebook, or the Twitter of old, there’s an oversized color logo fashioned into a kind of statue, along with LED screens rotating through a series of cheerful branded animations, and conference rooms named after key moments in the platform’s history.

It was only when the presentation began Tuesday morning at the company’s offices in Culver City, CA, that the differences started to become clear. Since 2017, every social platform has had to muster a response to increasingly skeptical regulators: to questions about misinformation, polarization, privacy and security.

For the American platforms, this has been a costly annoyance: cause to hire tens of thousands of contract content moderators, and to allow executives to be dragged before Congress for a series of theatrical tongue-lashings. (In Europe, where the systems of government are more functional, the pressures have led to more substantial changes.)

For TikTok, though, the scrutiny is existential. This week alone, Australia banned and then un-banned the app on government devices, but an investigation continues. And Sen. Michael Bennet (D-CO) became the latest lawmaker to call on Apple and Google to remove TikTok from app stores, citing fears Americans’ data could be shared with or manipulated by the government of China, where TikTok’s parent, ByteDance, is located.

It is these same fears that have spurred TikTok to invite a group of American journalists to this, its Transparency and Accountability Center. The company first planned to bring us here in 2020, but was delayed by the pandemic. (That September, I took a virtual visit instead.) The idea is to let us hear from, and ask questions of, a small group of company executives, while also working our way through a series of interactive demos that explain how the app and its trust and safety systems work.

The heart of the presentation comes early, when the company explains to us the basic outline of Project Texas: its $1.5 billion plan to move all data attached to American users to the United States, and to put into place various governance, compliance and auditing systems that will keep the company honest.

Under the terms of our visit, we are not allowed to quote from the Project Texas portion of the discussion. At Lawfare, Matt Perault and Samm Sacks, who appear to have attended a similar presentation, do a great job describing the various features of the effort.

There will be a new subsidiary of TikTok, called US Data Security, governed by an independent board of directors that is approved by the Committee on Foreign Investment in the United States. The leadership team, too, will be vetted and approved by the government. And users’ data will be secured by Oracle with strict access controls. (Oracle will go so far as to compile the final TikTok code and distribute it to app stores.)

Meanwhile, new monitoring systems will be put into place to prevent “state influence” — the word “China” is never used — from corrupting the app.

“We’re incredibly committed to this vision,” Vanessa Pappas, TikTok’s chief operating officer, tells us after the presentation. “You can see that just with the amount of investment we’ve had to date. It is unprecedented. And it is a standard that we believe that all big tech companies should be [held] to, in terms of providing that level of privacy, security and safety for users.”

Like so much of what TikTok says about its security practices, Project Texas seems both totally reasonable and, at least in terms of my ability to evaluate it, above my pay grade. Klon Kitchen, a security researcher at the (hawkish toward China) American Enterprise Institute, said this week that “TikTok is adopting a ‘catch me if you can’ strategy like the one previously employed by Huawei in the United Kingdom, where it strikes the pose of transparency but places the burden on outside reviewers of identifying and pursuing threats.”

Kitchen also noted that the vast scale of software code makes certain promises difficult, if not impossible, to guarantee. “Even if every line of TikTok’s code was initially reviewed and validated, there is simply no way to maintain reliable, real-time situational awareness on a code base this large and that is frequently updated,” he wrote. “A critical vulnerability can be introduced with just a few lines of code and third-party reviewers would be dependent on TikTok disclosing all changes—something we cannot count on.”
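
Kitchen’s point is easier to see with a toy example. The sketch below is purely hypothetical Python, not anything drawn from TikTok’s code base or Oracle’s review process: it shows how a couple of added lines in an innocuous-looking crash-reporting function could quietly widen what data leaves a device, the kind of change an outside reviewer could miss if it weren’t disclosed.

```python
# Hypothetical illustration only: a "small" change that is easy to overlook in review.
# The original function uploads narrow crash metrics; the added keyword argument
# quietly lets callers append arbitrary extra fields to the payload.

import json
import urllib.request

ANALYTICS_ENDPOINT = "https://example.com/metrics"  # placeholder endpoint


def report_crash(error_code: int, device_model: str, extra: dict | None = None) -> None:
    """Send a crash report to the analytics endpoint."""
    payload = {"error": error_code, "device": device_model}
    # These two lines are the "few lines of code" Kitchen describes: merging an
    # unvetted dict into the payload turns a narrow metrics call into a
    # general-purpose channel for whatever data a caller chooses to attach.
    if extra:
        payload.update(extra)
    data = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(
        ANALYTICS_ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=5)
```

Multiply that by millions of lines and frequent releases, and the scale problem Kitchen describes comes into focus.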

TikTok’s position with Project Texas is basically: if you don’t trust us, then trust Oracle. Critics say: you can’t design a workable system here, period.

You can imagine TikTok reading this and saying, well, what else do you want from us? We’re showing you the source code! Ultimately, any deal it strikes with the government will rely to some degree on trust between the parties. But who in the government is prepared to trust TikTok?