Facebook rethinks Holocaust denial

How real-world violence led Facebook to reverse its most controversial policy

I.

The biggest trend at platforms this month is changing your mind. Facebook belatedly banned QAnon. Twitter tapped the brakes on retweets. And on Monday, Facebook made one of the biggest policy reversals in its history, banning posts that deny or distort the Holocaust.

Today let’s talk about the company’s long journey to doing the right thing — and what other platforms might learn from Facebook’s experience.

First, the news. Here’s Sheera Frenkel in the New York Times, featuring a quote from CEO Mark Zuckerberg:

In announcing the change, Facebook cited a recent survey that found that nearly a quarter of American adults ages 18 to 39 said they believed the Holocaust either was a myth or was exaggerated, or they weren’t sure whether it happened.

“I’ve struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust,” Mr. Zuckerberg wrote in his blog post. “Drawing the right lines between what is and isn’t acceptable speech isn’t straightforward, but with the current state of the world, I believe this is the right balance.”

The move comes two years after Zuckerberg sparked a controversy by pointing to the platform’s tolerance of Holocaust denial as evidence of its commitment to free speech. (Part of the controversy stemmed from the fact that Zuckerberg, talking to Kara Swisher on her podcast, said that he did not believe Holocaust deniers were “intentionally getting it wrong” — suggesting that Holocaust denial was an honest mistake rather than a malicious ideology in its own right.)

But the 2018 controversy caught Facebook by surprise, because the policy Zuckerberg cited was not a new one. Facebook had defended the rights of Holocaust deniers since at least 2009, when it weathered an earlier firestorm over hosting anti-Semitic pages. As with the Cambridge Analytica scandal — another story whose basic facts had long been known, but which suddenly roared back into the spotlight after fresh news reports — seismic shifts in public opinion forced Facebook to rethink long-held values.

In the two years since, Facebook has accelerated efforts to ban hate speech and dangerous organizations from the platform. (Much of that work seems to have been catalyzed by the Christchurch shooting and its aftermath, in which platforms came under significant pressure to ramp up their enforcement efforts.)

That work was comprehensive enough that, if you didn’t know the history, you might assume that Facebook had carved out an exemption to its hate speech policies specifically to allow for Holocaust denial. That wasn’t the case, but it might as well have been, and ultimately it was untenable. And so — after years of civil rights groups asking the company to do so — Facebook finally caved.

II.

(I feel compelled to add a content warning about genocide to this section, which includes an account of my visit to Auschwitz.)

At a time when platforms face more pressure than ever before to remove offensive posts from their services, it can seem strange that Facebook ever permitted Holocaust denial at all. Historically, though, permitting offensive remarks about the Holocaust has been viewed by some groups as a pillar of America’s free-speech tradition. In 1978, the American Civil Liberties Union famously defended the rights of neo-Nazis to march in Skokie, IL, which was home to many Holocaust survivors.

The ACLU won the case — and lost 30,000 outraged members in the process. (The march never took place.) I learned about the saga as an undergraduate at Northwestern University, which is 15 minutes away from Skokie by car. As it so happened, my years at Northwestern saw a Holocaust denial controversy of their own. A long-tenured professor of engineering had used his university internet domain to post his denialist views, which he had originally published in a book.

Students, faculty, and alumni all called for the web page to be removed. But the university president, while condemning the professor’s views, said that to remove the page would represent an inappropriate infringement upon academic freedom. This position generated yet more outrage, and many alumni canceled their donations to the school. But the president held firm.

As a young journalism student, I was regularly exposed to stories about infringements upon free speech around the world. Viewed in this light, the president’s actions struck me as courageous. He had taken a costly stand in defense of a principle, and there are few things an idealistic young person loves more than watching someone take a costly, principled stand.

And as a student journalist, I took comfort in the president’s free-speech absolutism. If he would defend a Holocaust denier’s right to speak, I figured, he could also be trusted not to interfere in the affairs of the student newspaper, no matter how often it criticized him.

But the biggest reason I sided with the president as a student, I think, is that the problem of Holocaust denial seemed so abstract. There were no Nazis on my campus; the fascists had not attempted to march in our community in more than 20 years. A steady diet of television shows and Hollywood movies told and re-told the story of how America had vanquished the Nazis and liberated Europe. If a handful of moonbat conspiracy theorists wanted to deny the obvious, I thought, what did it matter?

The summer before my final year of school, the Anti-Defamation League sponsored a trip for college newspaper editors to visit Poland and Israel and learn about Jewish history. While in Poland, we visited Auschwitz, and while touring the grounds our guide made a grisly discovery. Without warning, he stooped down to the ground and picked up a handful of soil. In the dirt were small flecks of white, which he identified as fragments of human bone.

He made all of us look at what he had found. I suddenly felt nauseous and averted my eyes. But the guide said he wanted us to look closely, so that if ever anyone denied the Holocaust to us, we would know that they were lying. The history that had seemed so distant to me in Evanston wasn’t even in the past — it was there in the soil, and in his hands.

When I returned to school, the problem of Holocaust denial still seemed rather remote. But the longer I sat with what I had seen at Auschwitz, the less I believed in the principle of letting deniers have their say on public platforms. To host denialism, as the university president had chosen to do, seemed like complicity in a real evil: one that poses a threat in the present and the future as well as in the past.

III.

I tell this story because I suspect that I am not the only American whose instinctive defenses of the First Amendment have been increasingly challenged by the rise of right-wing extremism — a good deal of which has been organized on, and amplified by, social networks.

As noxious as the university professor’s web page was, it was not connected to a directory of 3 billion other human beings. It was not promoted by the university’s recommendation algorithms, as there were none. It did not appear as a series of ironic memes in a central feed that students were glued to. It promoted an abhorrent world view, but the page itself got very little promotion.

It’s difficult to say how much anti-Semites benefited from Facebook’s policy while it lasted. Nor can we say how much they benefited from similarly permissive policies on YouTube, Reddit, and Twitter.

But it seems relatively clear why Facebook ultimately changed its mind. “My own thinking has evolved as I've seen data showing an increase in anti-Semitic violence,” Zuckerberg said in his post. This analysis from the Centre for the Analysis of the Radical Right played a role in Facebook’s thinking, I’m told. And according to the ADL, anti-Semitic incidents reached an all-time high in 2019. What had started at Facebook as an abstract discussion about speech has become an urgent, tactical discussion about how to thwart real-world acts of terrorism.

Future historians will long debate what precise combination of forces led to the resurgent anti-Semitism and other hate movements we now see around the world. But for far too long, those movements had an unwitting ally in social networks.

Banning Holocaust denial doesn’t make it go away — Germany bans it by law, and far-right forces are still on the rise there (and preaching the gospel of QAnon, to boot). But if you promise to ban hate speech, you have to ban Holocaust denial, too. They have always been one and the same, and the debate over what to do about it was never as abstract or as principled as some of us wanted to believe. I wish I’d seen that sooner. And I wish Facebook had, too.


The Ratio

Today in news that could change public perception of the big tech companies.

⬆️ Trending up: Microsoft helped to bring down a Russian malware botnet that authorities worry could interfere in the election. (David E. Sanger and Nicole Perlroth / New York Times) But security researchers warn that some aspects of TrickBot are likely to survive the purge.

⬆️ Trending up: Mark Zuckerberg and Priscilla Chan donated an additional $100 million to local election officials in the run-up to Election Day. The funds, which will cover administrative costs associated with the election, follow an earlier $300 million gift.

⬆️ Trending up: Facebook donated £1 million to help save the United Kingdom’s Bletchley Park. The former center of Allied code-breaking during World War II is now a museum, and has struggled with losses related to the pandemic. (James Vincent / The Verge)


Governing

Facebook said it will stop accepting ads from groups discouraging people from getting vaccinated. The move comes ahead of the expected arrival of a COVID-19 vaccine. But there are some important nuances here, as Salvador Rodriguez writes at CNBC:

Facebook said it would allow ads like the ones a state delegate candidate in Virginia launched in August, which included the language “STOP FORCED CORONAVIRUS VACCINATIONS! ... All medications have risks, and we believe discussion alone of mandating a vaccine before it’s released, without knowing if there’s long term side effects, is both premature and dangerous.”

However, ads that explicitly discourage vaccines — including by portraying them as ineffective or unsafe, among other things — will be banned.

Google employees are heavily discouraged from discussing antitrust issues in the workplace. If you say “competition” three times in Mountain View, Google chief legal officer Kent Walker may appear in your mirror. (Daisuke Wakabayashi / New York Times)

The European Union has made a “hit list” of 20 large technology companies that will be subject to more stringent rules about competition. The moves could include efforts to force companies to sell off parts of their services. (Javier Espinoza / Financial Times)

The Philippines has begun registering citizens for a new national identification system that includes the storage of biometric data. Despite the obvious privacy concerns, 73 percent of citizens reportedly support the system. (Jun Endo / Nikkei Asia)

YouTube’s CEO won’t say that she’ll ban QAnon. In an interview with CNN, Susan Wojcicki would say only that YouTube is “looking closely” at the movement. (Kaya Yurieff / CNN)

Sellers on Etsy are getting around a QAnon ban by posting simple “Save the Children” messages, which can often slip under the radar. (Liz Flora / Glossy)

Misinformation is getting more engagement on Facebook this year than it did in 2016. “Comments and shares of articles from news outlets that regularly publish falsehoods and misleading content roughly tripled from the third quarter of 2016 to the third quarter of 2020,” according to new research from the German Marshall Fund, NewsGuard, and News Whip. (Davey Alba / New York Times)

Liberal politics play much better on Instagram than they do on Facebook. “There was 36% more engagement on the top 50 accounts posting about ‘Black Lives Matter’ on Instagram versus Facebook in the past 30 days, and nearly 3x as much about ‘climate change,’” according to data from Facebook-owned CrowdTangle. (Neal Rothschild and Sara Fischer / Axios)

Generation Z’s information diet appears to be Instagram posts, tweets, and TikToks all cross-posted to various platforms with little context. Kids these days are smarter about misinformation than their grandparents are, but their continuous “infograzing” can lead to confusion about what’s true and what’s not. (Matthew Choi / Politico)

Mark Zuckerberg and Priscilla Chan are waging a costly battle against Proposition 13, the California law that has frozen property taxes in the state for decades. The couple have spent almost $11 million in support of a ballot proposition that would lift the cap on business property taxes, raising money for schools and other social services. (Teddy Schleifer / Recode)

The Trump administration used Fox News as a laundromat for unverified Russian information about top Democrats. It’s an alarming instance of the US intelligence community intervening in domestic politics. (Marshall Cohen, Oliver Darcy and Zachary Cohen / CNN)

A Twitter tool that measures users’ happiness levels using sentiment analysis has found that people really are more sad in 2020 than ever before. The “Hedonometer” has set multiple records this year as people despair over the pandemic and other social issues. (Casey Schwartz / New York Times)

The famed Macedonian clickbait teens of the 2016 US presidential election can now be found on the right-wing alternative social network Parler, where they are promoting plagiarized versions of articles from US conservative media. In the words of the author, “in their zeal to abandon platform moderation as a censorious philosophy for the libz, conservative apps like Parler created a market to rip off their own movement.” (Adam Rawnsley / Daily Beast)


Industry

Apple introduced four new iPhones and a smaller version of its HomePod during its big fall event today. The marquee feature this year is access to 5G cellular networks, which could be interesting to Americans if quarantine ever ends. (Verge staff)

The new iPhone also includes a depth-sensing LiDAR sensor, and Snap is already using it to build a new augmented reality lens. (Sarah Perez / TechCrunch)

Elsewhere: The first Snapchat “local lens” is live in London. The feature allows users to create virtual murals on famous landmarks. (Will Bedingfield / Wired)

Facebook Messenger got a big update that refreshes its look while tying it more closely to Instagram. Sarah Perez reports at TechCrunch:

While Instagram users had to opt in to the upgraded new feature set in order to also gain access to the cross-platform communication capabilities, Messenger users don’t have to make a similar choice.

Instead, Facebook says this morning that cross-app communication with Instagram will be rolled out soon to users across North America. (At the time of the Instagram announcement, Facebook hadn’t yet confirmed which markets would receive the update first.)

Oculus Quest 2 began shipping today. Facebook’s new VR headset is $100 cheaper than its predecessor. (Adi Robertson / The Verge)

WordPress released a new tool to turn your blog posts into Twitter threads. Please do not do this under any circumstances. (Sarah Perez / TechCrunch)

An internal Google survey found that engineers are having a harder time coding at home. The share of engineers who reported feeling “highly productive” fell 8 percentage points in the quarter that ended in June, down from 39 percent in March. (Nick Bastone and Amir Efrati / The Information)

TikTok rival Triller is exploring a deal to go public through a SPAC. Absolutely nothing about Triller is adding up for me. (Joshua Franklin and Echo Wang / Reuters)

Amazon made an app that lets you point your phone camera at old shipping boxes to trigger augmented reality experiences. If you have literally any idea why this app exists, please get in touch. (Sarah Perez / TechCrunch)

A profile of Steven Crowder, the conservative satirist who has thrived on YouTube by making the site itself his chief antagonist. Crowder has developed a huge audience by making videos that come close to violating YouTube’s guidelines — the sort of “borderline content” the company has promised not to recommend. (Mark Bergen / Bloomberg)


Things to do

Stuff to occupy you online during quarantine.

Listen to this week’s Media Voices podcast. I really enjoyed speaking with host Esther Kezia Thorpe about the Platformer launch and what’s coming next.


Those good tweets


Talk to me

Send me tips, comments, questions, and dramatic policy reversals: casey@platformer.news.