From the start, the biggest question hanging over the Oversight Board has been whether it can be seen as legitimate. The board — an independent group of experts established by Facebook to adjudicate difficult questions about speech and moderation on the platform — represents a novel effort to build a kind of judicial branch for Facebook. But nearly three years after it was first conceived, the board took its first cases only last month. None has yet been decided.
In the time since the board first convened, though, the mother of all cases fell into Facebook’s lap. In the wake of the insurrection at the Capitol, Facebook joined other companies in suspending Donald Trump from its platform. Facebook said Trump could not access his account while still in office, but left open the question of whether the suspension was permanent. Earlier this week, it said it had “no plans” to bring Trump back. But plans, as we know, can change.
And on Thursday they did. Here’s Facebook’s Nick Clegg:
Today, Facebook is referring its decision to indefinitely suspend former US President Donald Trump’s access to his Facebook and Instagram accounts to the independent Oversight Board. The board was established last year to make the final call on some of the most difficult content decisions Facebook makes. It is an independent body and its decisions are binding — they can’t be overruled by CEO Mark Zuckerberg or anyone else at Facebook. The board itself is made up of experts and civic leaders from around the world with a wide range of backgrounds and perspectives.
We believe our decision was necessary and right. Given its significance, we think it is important for the board to review it and reach an independent judgment on whether it should be upheld. While we await the board’s decision, Mr. Trump’s access will remain suspended indefinitely. We look forward to receiving the board’s decision — and we hope, given the clear justification for our actions on January 7, that it will uphold the choices we made. In addition to the board’s determination on whether to uphold or overturn the indefinite suspension, Facebook welcomes any observations or recommendations from the board around suspensions when the user is a political leader.
The board agreed to take the case more or less immediately, saying it would be referred to a panel “over the coming days.” Facebook could have asked for an expedited review, but didn’t, meaning the board can take a full 90 days to deliberate. “Members will decide whether the content involved in this case violated Facebook’s Community Standards and values,” the board said in a statement. “They will also consider whether Facebook’s removal of the content respected international human rights standards, including on freedom of expression and other human rights.”
Evelyn Douek, an expert in online speech regulation who earlier this month made a persuasive case that Facebook should refer the Trump matter to the board, walked through the implications of the board taking the case in a new post that you should very much read here. Among other things, it’s very good on the process by which the board will seek to answer the question of whether Trump should be re-platformed.
The board currently has 20 members, but only a five-person panel will initially consider the case. Unlike an appellate court panel, though, the names of the panelists will not be released publicly. Four will be assigned at random. At least one panel member will be from the U.S., since every panel must have a representative from the region the decision primarily affects, and while the region technically includes Canada, the board does not currently have any Canadian members. After reviewing the information about the case, the panel will draft a written decision, which may include any concurring or dissenting views if the panel cannot reach consensus.
The 20-member board as a whole will then review this panel’s draft decision. If the board is not satisfied with the decision, a majority of the members can decide to send the case back for re-review, convening a new panel and starting the process again on an expedited timeline. There is currently no mechanism to break the infinite loop that may occur if a majority of the board is consistently dissatisfied with panel decisions.
The fact that such a high-profile and consequential case could nonetheless end in a stalemate speaks to the board’s immaturity. Given the stakes, and the board’s desire to be seen as legitimate, it’s hard to imagine that it won’t reach a decision here. But the fact that it might not speaks to some of the anxieties that other academics have had about this move.
“I think it is a huge gamble with the public sense of FBOB legitimacy, especially inside the US,” said Daphne Keller, director of the Program on Platform Regulation at Stanford's Cyber Policy Center, when I asked. “Personally, if I were invested in FBOB, I would not put them at risk like that when public opinion is still so unformed. But maybe it'll be a winning bet.”
Kate Klonick, who followed the development of the board as an observer, made a similar point on Twitter. “There are going to be a lot of ‘this is a Marbury v. Madison moment’-type takes,” said Klonick, an assistant professor at St. John’s University School of Law. “The Board can establish its seriousness and jurisdiction/power over FB. That could be good for the Board, but it also means that it's very risky for establishing legitimacy, esp. so early in its history.”
Klonick added: “But not sending it would also have been a damning message — that the Board's authority was limited and that FB didn't really intend to give it any hard questions.”
And there’s the rub. Like so many questions about platforms, this one is ultimately about power. At the moment, one person — Mark Zuckerberg — has the final authority over every decision made about what speech belongs on Facebook. Many people at Facebook think this position is untenable — including Zuckerberg, which is why he supported and funded the Oversight Board.
But just because Zuckerberg won’t make this decision doesn’t mean the board’s decision will be better. It’s possible that Zuckerberg would make a decision more consistent with American values than the largely international board, for example. Because we’ve never seen the board decide a case, we have no insight into its members’ beliefs or their juridical style. It’s all very opaque.
And so I run scenarios in my head.
Say the board affirms Facebook’s initial decision and says Trump should be kept off the platform. This would likely come as a relief to many American users. But it could create the perception that the board is a rubber stamp.
Then there are the Americans who are upset by the decision to remove Trump — a group that might include the majority of daily Facebook users in the United States. Will Facebook’s standing improve with those users because the company put the question to a group of experts first? It seems unlikely.
More interesting is the question of what will happen if the board rules that Trump should return. Many people who replied to me on Twitter today suggested that they assume this will happen, and that it is in fact what Facebook executives secretly want — whether to drive engagement, sell Trump ads, or some other dark motive. Just as in the opposite case, the decision will prompt some to question the board’s legitimacy. Instead of a rubber stamp, here the board will be seen as an entity through which Facebook launders its most controversial choices.
Ultimately, the board’s legitimacy in the public imagination can’t help but be tied to Facebook’s. At a moment when trust in Facebook is relatively low, at least in the United States, a brand-new board created by the company is also likely to suffer from public doubt. Executives believe that the board will gain legitimacy over time as it issues opinions the company disagrees with, and Facebook abides by them anyway.
I’ve been a mostly enthusiastic supporter of the board’s creation, because I think that if we are going to have platforms with billions of users, those users deserve something akin to a judicial system — to justice. An effective Oversight Board won’t feel legitimate right away, but I believe it can build legitimacy over time, primarily by considering cases carefully and explaining its decisions clearly.
First, though, it has to get the Trump case right — whatever that means. For the board, and maybe the world, the stakes could scarcely be higher. And the policy guidance it gives Facebook on how to treat the speech of political leaders more broadly, even when that speech is abhorrent, should have repercussions far beyond the United States. (Trump is not the first world leader to use the platform to threaten his own citizens — not by a long shot.)
If I were on the board, I’d vote to affirm Facebook’s decision to remove Trump’s account. It was a malign presence in the world, and we’re better off without it. Until recently, social networks held world leaders to a much lower standard than average citizens, creating various exceptions for their speech. I suspect the board could find favor by arguing — and by ruling — that leaders should be held to a higher one.
Today in news that could affect public perception of the big tech companies.
⬇️ Trending down: Democratic members of the House of Representatives called on Facebook, Twitter, and YouTube to make sweeping changes in response to the January 6 Capitol insurrection. “For YouTube, lawmakers said they would like to see the company disable auto-play and stop recommending any conspiratorial content alongside videos or on users’ homepages.” (Makena Kelly / The Verge)
⭐ Twitter locked the Chinese embassy out of its account after a tweet about its treatment of the Uighur population. The move illustrates Twitter’s increasingly bold enforcement of its policies against state-level actors. Kurt Wagner and Peter Martin had the story at Bloomberg:
Twitter has locked the official account for the Chinese Embassy to the U.S. after a post that defended the Beijing government’s policies in the western region of Xinjiang, where critics say China is engaged in the forced sterilization of minority Uighur women.
The tweet, which said Uighur women were no longer “baby-making machines,” was originally shared on Jan. 7, but wasn’t removed by Twitter until more than 24 hours later. It has been replaced by a label saying, “This tweet is no longer available.” Even though Twitter hides tweets that violate its rules, it still requires the account owner to manually delete the post in order to regain access.
Joe Biden named Jessica Rosenworcel as acting chief of the Federal Communications Commission. The 49-year-old lawyer is just the second woman interim chief of the FCC in history. (April Glaser / NBC)
A look back at Ajit Pai’s most harmful actions as head of the FCC. Killing net neutrality leads the list, but he was a reliable source of giveaways for his old bosses in the telecom industry. (Matthew Gault / Vice)
A court ruled that Amazon doesn’t have to provide service to Parler. “U.S. District Judge Barbara Rothstein in Seattle said Parler failed to show it was likely to prevail on the merits of its claims.” (Reuters)
“Parler could expose its users to Russian surveillance if the site someday does relaunch in full with DDoS-Guard,” according to this analysis. The Russian company needs access to all traffic flowing through Parler servers to defend against denial-of-service attacks, putting user data at risk. (Lily Hay Newman / Wired)
Here’s a database of 7.6 million tweets related to election fraud. Go nuts, kids. (VoterFraud2020)
Steve Bannon broadcasts election denialism and apocalyptic calls to action several times a day via Apple’s podcast app, prompting some calls for Apple to remove him from its directory. But the company has been reluctant to intervene so far. (Lydia DePillis / ProPublica)
Using facial recognition technology, amateurs are combing through downloaded Parler archives to develop a fuller picture of who was present at the Capitol insurrection. The large amount of imagery posted on the site has allowed investigators to find people who did not post to Parler themselves. (Joseph Cox / Vice)
Some of the key documents of the Trump era are proving difficult to archive, presenting challenges for future historians. Deleted posts, disappearing websites, and the private nature of much social media are all part of the problem. (Celeste Kaufman / Daily Dot)
Meet the new tech bureaucracy. Capsule biographies of some of Biden’s top nominees to posts that could have an influence on Silicon Valley, from the Department of Justice to the Office of Science and Technology Policy. (Emily Birnbaum and Issie Lapowsky / Protocol)
China is accelerating efforts to spread disinformation about Western vaccines and conspiracy theories about COVID-19. “The reports, claims by officials and unchecked online speculation this month appear to be part of a renewed Chinese push to cast blame for the pandemic elsewhere and undermine public confidence in Western vaccines.” (Gerry Shih / Washington Post)
China proposed new measures to increase competition among its tech giants, threatening companies with more than 50 percent market share with antitrust probes. The move could prove disruptive to companies like Ant Group and Tencent. (Bloomberg)
⭐ Beeper is a universal chat app — combining iMessage, Instagram, Slack, and other products — that costs $10 a month. A universal inbox sounds great to me! Here’s Jon Porter at The Verge:
Although Beeper integrates with the world’s most popular messaging services, like WhatsApp, Signal, Telegram, Slack, Twitter, Discord, Instagram, and Facebook Messenger, it’s the support for Apple’s iMessage that’s perhaps most interesting. iMessage is only officially available on Apple devices, and it’s often cited by users as something that prevents them switching to Android. Migicovsky says Beeper should allow iMessage to work on Android, Windows, and Linux, but admits that it’s “using some trickery” in doing so.
Patreon is considering a public listing this year. Business doubled during the pandemic, and the company recently passed $100 million in annualized revenue. (Ross Matican and Amir Efrati / The Information)
An internal Facebook memo tells employees to think about building products with the assumption they will get no access to user data. “The Big Shift,” by longtime executive Andrew “Boz” Bosworth, aims to get employees to minimize the amount of data they collect. (Alex Kantrowitz / Big Technology)
TikTok collects significant data about users even when they do not have accounts, according to this analysis. A GDPR request revealed that the company had obtained more than 350,000 unique data points about the writer even though he never logged in during two months of use. (Riccardo Coluccini / Vice)
Google signed a deal with France to reuse snippets of publisher content in Google search products. It’s unclear how valuable the deal will be to the average publisher, though. (Natasha Lomas / TechCrunch)
Those good tweets
Talk to me
Send me tips, comments, questions, and your public comment on the Trump case: firstname.lastname@example.org.