Facebook's Supreme Court arrives
How much justice can Facebook's new justice system provide?
Oversight Board co-chairwoman Helle Thorning-Schmidt (Tobias Hase / Getty)
Two and a half years after Mark Zuckerberg floated the idea on a podcast, and several months after organizers said it would be ready to hear cases, Facebook’s independent Oversight Board is now up and running. In a call with reporters today, the board’s co-chairs said that they are now prepared to hear appeals from the billions of people who use Facebook and Instagram each day. The ability to appeal to the board will roll out gradually around the world over the next several weeks. And when it’s complete, one of the bolder experiments in platform governance will have finally begun.
Let’s begin by taking a step back. One of the more striking things about our social networks, I have argued, is how terrible they are at customer service. If your post is taken down in error, or your account is suspended for unclear reasons, or you are banned and lose access to all of your data, historically you have had almost no recourse whatsoever. You fill out a form, you send it in, you pray. Maybe your prayer will be answered; more likely it will not. Tech journalists might be more sensitive to this problem than almost anyone else: each day, our inboxes fill up with anguished requests from users of YouTube and Twitter and Instagram who find their posts blocked or their accounts banned. We can almost never help them.
In the early days of social networks, this did not seem to be of particular concern to the tech companies themselves. Good customer service eats away at profit margins, and executives were convinced that most of it could be effectively automated anyway. But as the platforms grew into monoliths, to the point that even the slightest change to the user interface draws angry Congressional inquiries, the question of tech power has come to feel more urgent.
Questions of power feel particularly acute when it comes to speech. This is especially true at Facebook, where the founder CEO has majority control of voting shares and lacks any real check on his power. When a high-profile and thorny issue of speech arises — as it did several times this summer — it lands at Zuckerberg’s feet, for him to make the final call. Given Facebook’s vast size, this setup has ensured that at any given time, millions of people are angry at him and at Facebook for deciding against them. It has also ensured that, aside from complaining, they can’t do anything about it.
What the Oversight Board now promises is — well, what if they could?
When it’s fully operational, you will have the ability to appeal a Facebook content moderation decision to an independent body, with the company honor-bound (if not legally required) to accept its decisions. For starters, you will only be able to appeal when you believe your post has been wrongfully removed; eventually, you’ll be able to appeal when you believe a post has wrongfully been allowed to stand. And starting today — ahead of the November 3 US presidential election — Facebook will also be able to refer policy issues to the board and receive advisory opinions on what to do.
All of this has been a significant undertaking. Facebook placed $130 million into an irrevocable trust to fund the board’s operations, and for its initial members recruited a former prime minister, a Nobel Peace Prize laureate, constitutional law scholars and human rights advocates.
There has also been the technical work. Facebook built software that will let it transfer cases to the board in a way that protects users’ privacy, and a case management tool that lets board members choose cases to review, examine outside opinions on the cases and other supplementary materials, and to deliberate with their peers. (The board will eventually have 40 members, but individual cases will be heard by a small panel.)
It has taken more time than Facebook once hoped. And the pandemic hasn’t helped — Facebook had to scrap plans to get board members together for in-person training on the new system. But Brent Harris, Facebook’s head of governance, told me in an interview yesterday that the company had moved as quickly as it could.
“From January 1, we have been in institution-building mode,” he said. “I’m not sure how many institutions in 10 months have actually gotten to a spot where they were then ready to take on a responsibility like this — to take appeals from 2 billion people around the world. So we actually think that we've moved pretty fast on this one.”
At the same time, Facebook and the board have come in for criticism for failing to get to work before voting began in the US election. In September, a group of vocal Facebook critics announced they were forming a rival organization, confusingly titled the Real Oversight Board, to begin issuing immediate opinions on what Facebook ought to do. (The gist was that Facebook should take a lot of things down.)
In any case: the board is here, and I’m glad. But it can all feel a little anticlimactic, mostly because so many of my big questions about the board remain unanswered.
On a call with the board’s co-chairs this morning, I asked how many cases it expects to hear. Will it select a handful each year, like the US Supreme Court, or will it be set up to process more? (It has more than four times as many “justices” as the Supreme Court, but is serving more than 10 times as many “citizens.”)
“It’s a good question — and one that I think we will be developing over time as we see what the volume of cases that are appealed might be, and as we further develop and refine our case selection procedures,” said Jamal Greene, the board’s co-chairman.
Board members made it clear that, while the board can move quickly should it choose to do so, for the most part it won’t. Facebook will continue to moderate the vast majority of all content on its platform, and to hear the first round of user appeals itself. The Oversight Board will hear only a small fraction of cases beyond that. But if the company or the board has any idea how many appeals to expect, neither is saying. (I asked people at both.)
“Facebook was always criticized for moving fast and breaking things,” said Helle Thorning-Schmidt, the former prime minister of Denmark and board co-chairwoman. “I think we are looking at the opposite — we want to look at quality, and look at how we are here for the long term, rather than to move quickly and be under a lot of time pressure.”
The idea is that the board will pick representative cases — ones that will set precedents and cause Facebook to update its policies. If the board had existed when photos of breastfeeding were a big controversy, you can imagine it taking one case in which a user’s photo had been removed and ruling that Facebook should be more permissive, causing it to relax its policies around the world.
Embedded in the Oversight Board is the idea that an entity as large as Facebook ought to have something resembling a justice system. I’m pressing it on how many cases it might hear because I want to know how much justice we can expect from it.
I’m confident that Facebook will use it as a kind of release valve for some of the trickiest policy decisions that it faces. And I’m sure the board will provide thoughtful guidance on a wide range of issues and individual cases where Facebook has erred. I’m less confident that the board can make Facebook feel more just to the average person — the one who logs on to find that their business’s page has been removed, or account has been suspended, or post has been put behind a warning screen. Customer service issues are on some level about justice, but that’s not the kind of justice that the board is set up to provide.
Still: I’m optimistic. For all its faults, the board still represents an unprecedented move to devolve some of a tech giant’s power back to the people that, on some level, it represents. Yes, it will serve to give Facebook public-relations cover during controversies. But it also enshrines the principle that citizens of a platform have a right to redress their grievances. However much justice the board offers them in the future will almost certainly be more than they are getting today.
Weekend question: What would make the Oversight Board more effective? I’ve opened up comments on this post to all readers.
Today in news that could change public perception of the big tech companies
⬆️ Trending up: Google said it would spend $100 million on Black-owned businesses in 2021. It’s part of the company’s commitment to spend $1 billion on suppliers with diverse ownership. (Sundar Pichai / Google)
⬆️ Trending up: Twitter is adding prompts, hashtags, and emoji to promote early voting. It’s part of a surge of pro-democracy moves by the company. (Kim Lyons / The Verge)
🔃 Trending sideways: Twitter and the White House denied claims that a security researcher accessed the president’s account. There are still more questions than answers here; Vice cast additional doubt on the researcher’s story. (Adi Robertson / The Verge)
⬇️ Trending down: Former Facebook employees told Mother Jones that the company knew changes to its News Feed algorithm would boost traffic to far-right publishers at the expense of progressive publishers in 2017. This undermines the company’s long-running argument that the News Feed simply shows people what they want to see. (Monika Bauerlein and Clara Jeffrey / Mother Jones)
⭐ Senate Republicans voted to subpoena Mark Zuckerberg and Jack Dorsey so they could complain about their handling of the New York Post disinformation campaign over video chat. The only better use of the Senate’s time that I could think of would be to pass immediate economic relief for the tens of millions of unemployed Americans suffering during the pandemic. Here’s Lauren Feiner at CNBC:
Twelve Republicans on the committee voted to authorize the subpoenas, while all 10 Democrats boycotted the markup in a protest of the committee’s earlier vote on the nomination of Amy Coney Barrett to the Supreme Court.
⭐ The Federal Trade Commission met to discuss its antitrust investigation into Facebook, signaling that the case is reaching its final stages. Among other things, the FTC is looking into Facebook’s history of acquisitions. (Tony Romm / Washington Post)
Do the big tech platforms need a dedicated regulator? The Google antitrust case has some legal experts and economists making the case. (Steve Lohr / New York Times)
Republicans are furious that Twitter is asking users to read stories before retweeting them. To be clear, you can still retweet! You just click the button one more time! (Adi Robertson / The Verge)
Artists are also complaining about Twitter’s new push to make people add a comment before retweeting, saying it reduces their exposure (?). Get over it, artists! (Jacob Kastrenakes / The Verge)
The top investigator in the Google antitrust case says there “was not a rush” to sue the company. Jeffrey Rosen took his sweet time about it! (Cecelia Kang / New York Times)
Google AI technology will be used as part of a project to build a virtual border wall. The news comes two years after Google was roiled by internal protests about the company’s contracts with the military. (Lee Fang and Sam Biddle / The Intercept)
Amazon workers are threatening to shut down warehouses if they don’t get Election Day off to vote. More than 6,500 employees have signed a petition. (Karen Weise / New York Times)
Facebook and Instagram inadvertently censored protests against police violence in Nigeria. The company apologized today after posts of the protests using the #EndSARS hashtag were blocked. (David Gilbert / Vice)
Facebook removed criticism of the government in Vietnam at the government’s request. Human rights advocates decried the move. (David S. Cloud and Shashank Bengali / Los Angeles Times)
A Facebook-backed project to fund solar energy is being used to power fracking operations in West Texas. Seems less than ideal. (Justin Nobel / DeSmog)
The World Health Organization licensed its library of content related to COVID-19 for Wikipedia’s free use. The materials include “Mythbuster” infographics designed to address misinformation about the disease. (Donald G. McNeil Jr. / New York Times)
The United States’ lack of a national strategy to fight COVID-19 has prevented the Apple/Google exposure notification apps from getting wide adoption. Ten states and the District of Columbia have released apps so far, to little effect. (David Uberti / Wall Street Journal)
A Portland man named Christopher Howell is developing facial recognition tools to use against police. The move is designed to counter the fact that officers at protests often cover the names on their badges. (Kashmir Hill / New York Times)
TikTok is adding more information to the notifications it sends users who have their content removed. It’s also providing resources to people whose posts are removed for containing content related to self-harm. (TikTok)
⭐ Facebook will begin to charge companies who want to use WhatsApp for Business. Time to recoup that $19 billion investment, almost seven years later. Ingrid Lunden has the details at TechCrunch:
It’s launching a way to shop for and pay for goods and services in WhatsApp chats; it’s going head to head with the hosting providers of the world with a new product called Facebook Hosting Services to host businesses’ online assets and activity; and — in line with its expanding product range — Facebook said it will finally start to charge companies using WhatsApp for Business.
Facebook is testing a chronological News Feed on mobile devices. Among other things, the availability of a chronological feed could help reduce complaints about “censorship.” (Josh Hendrickson / Review Geek)
Influencers are increasingly turning to live-streaming to make money by selling products, recreating the basic logic of QVC in the process. Amazon, Facebook, and Instagram have all launched live shopping features in recent months. (Ashley Carman / The Verge)
Former Google CEO Eric Schmidt called social networks “amplifiers for idiots” at a virtual Wall Street Journal conference. Where is the lie etc etc. (Gerrit De Vynck / Bloomberg)
Things to do
Stuff to occupy you online during quarantine
Those good tweets
Fox News @FoxNews: Twitter, Facebook have censored Trump 65 times, Biden zero, study says https://t.co/qtqlvJ4b8Y
Talk to me
Send me tips, comments, questions, and Oversight Board referrals: firstname.lastname@example.org.