It all comes down to this

Will tech platforms exorcise the ghost of 2016 on Election Day?

People head to the voting booths to cast their in-person absentee ballots at Seacoast Church West Ashley on October 30 in Charleston, South Carolina. (Michael Ciaglo / Getty Images)

So, it’s going to be a big week — or weeks. With that in mind, let’s look at a handful of topics to keep an eye on as Election Day / Week / Season unfolds: the last-minute policy preparations of the tech platforms; Facebook’s two-front battle on political ads and fact-checking; and assorted electoral odds and ends. For four years the big social networks have sought an opportunity to show that they can police their platforms better than the 2016 election suggested they could.

Tomorrow they get their chance.

I. The platforms look busy

You know how in the Before Times, when your boss walked by your desk, you would instinctively make a series of dramatic movements to indicate that you were extremely busy and therefore deserved continued employment?

The platforms’ behavior over the past week or so has a real boss-walking-by-the-desk energy:

Happily, it’s not only tech platforms monitoring digital election threats this week. The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency has launched a virtual war room to receive reports from local election officials and convey classified information from intelligence agencies about efforts to undermine the election. The move is notable for occurring despite what the Washington Post describes here, accurately, as “an abiding lack of interest in election security from President Trump.”

Of course, platforms have been making adjustments to content policies related to elections, misinformation, and various potential harms since the aftermath of 2016. But this late flurry of moves shows two things. One, there’s a sincere determination to ensure that, no matter who becomes president in 2021, platforms are not seen to have swung the outcome through lax policing.

And two, they’re still figuring out the best way to police their platforms.

II. Facebook and political ads

The post-2016 election critique of tech platforms is that they were overrun with misinformation and influence operations, possibly tipping the outcome to Trump. Most of the policy tweaks above are aimed at preventing that kind of scenario from happening again.

But the prevailing 2020 election critique of platforms is that they are biased against conservatives or against liberals, depending on who you ask. You can see those fears manifested in the two big Facebook controversies that popped up over the weekend.

The first is that “technical issues” prevented some campaigns from running advertisements on the platform. On Tuesday, Facebook flipped a switch blocking political advertisers from uploading new ads in the week before Election Day, but the switch inadvertently prevented some pre-approved ads from running. The Joe Biden campaign, which has been swinging a bat at Facebook for much of the election, said the issues cost it an estimated $500,000 in lost fundraising.

The issue also affected the Trump campaign, Politico reported, but Republicans were uncharacteristically quiet about what looked like good fodder for further complaints of partisan bias. Facebook published a blog post on Thursday night asserting that the issues had nothing to do with partisanship — and said that, in addition to technical problems, some issues in running ads had stemmed from advertiser confusion over its changing policies.

Mistakes happen, of course, but at the moment any mistake that can be chalked up to partisanship — fairly or unfairly — carries higher stakes. Both Biden and Trump have said they want to repeal Section 230 of the Communications Decency Act, which limits platforms’ liability for what their users publish, and so you can expect to hear more about these issues no matter who wins.

Incidentally, the Facebook story isn’t just “our ads won’t run” — it’s also “look at the ads that are running!” As the Wall Street Journal reported on Sunday, Facebook pages that have had their ads removed for spreading misinformation have been able to easily re-post them by making slight changes to the wording.

III. Facebook and fact checks

The other big bias story that Facebook is up against right now concerns fact checking.

Last month, using a trove of internal audio that was leaked to me, I wrote about how Facebook had doubled down on centrism this summer — committing itself to a position of “neutrality” wherever possible, and often finding that doing so is effectively impossible. The company’s strained efforts at neutrality — what the press critic Jay Rosen has called “the view from nowhere” — can warp platforms in strange ways.

In an important new story in the Washington Post, Isaac Stanley-Becker and Elizabeth Dwoskin lay out just how literally some Facebook employees took the neutrality mandate:

One of the people familiar with internal deliberations said some efforts to improve fact-checking and content review have been stymied by concerns about a disproportionate impact on conservative users. Members of Facebook’s public policy team recently floated a proposal ensuring that a new system to escalate harmful posts do so evenly along ideological lines, the person said, so that 50 percent of the escalated material would be conservative and 50 percent would be liberal, even if the material was not equivalent in potential risk.

On one hand, this idea was not implemented. On the other, the fact that it was even floated raises some concerns. As the political scientist Brendan Nyhan noted: “There is no reason to think the share of false claims is precisely 50/50 at any given time. Every credible study finds that the misinformation ecosystem is asymmetric. That's not bias; it’s a scientific fact.”
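
To see why that idea alarmed researchers, here is a minimal, entirely hypothetical sketch in Python. The post pool, risk scores, and function names are invented for illustration; nothing here is drawn from Facebook's actual escalation system. It compares escalating flagged posts by estimated risk against escalating them under a fixed 50/50 ideological quota when the underlying pool is asymmetric, as the research Nyhan cites says it is.

```python
# Hypothetical sketch: why a fixed 50/50 ideological quota is not "neutral."
# All names and numbers are illustrative, not Facebook's actual system.

from dataclasses import dataclass

@dataclass
class Post:
    lean: str    # "conservative" or "liberal"
    risk: float  # model-estimated harm score, 0.0 to 1.0

# An asymmetric pool of flagged posts, per the asymmetry described above.
flagged = (
    [Post("conservative", r) for r in (0.9, 0.8, 0.75, 0.7, 0.6, 0.5)] +
    [Post("liberal", r) for r in (0.65, 0.4)]
)

def escalate_by_risk(posts, limit):
    """Escalate the highest-risk posts, regardless of ideology."""
    return sorted(posts, key=lambda p: p.risk, reverse=True)[:limit]

def escalate_by_quota(posts, limit):
    """Escalate an equal number from each side, even if risk is unequal."""
    picked = []
    for lean in ("conservative", "liberal"):
        side = sorted((p for p in posts if p.lean == lean),
                      key=lambda p: p.risk, reverse=True)
        picked += side[: limit // 2]
    return picked

if __name__ == "__main__":
    print([(p.lean, p.risk) for p in escalate_by_risk(flagged, 4)])
    # Mostly high-risk conservative posts, reflecting the asymmetric pool.
    print([(p.lean, p.risk) for p in escalate_by_quota(flagged, 4)])
    # Two per side: a 0.4-risk post is escalated while a 0.75-risk post is not.
```

In the quota version, a lower-risk post gets escalated simply to balance the ledger while higher-risk posts sit in the queue, which is exactly the distortion Nyhan is describing.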

Asymmetry may be a fact, but it is a fact that is at odds with centrism. Centrism presupposes that both sides of the political spectrum are good-faith actors abiding by a shared set of rules. Individual actors may stray into extremism here and there, but the centrist is committed to finding a middle path.

The problem is that the Republican party represents a shrinking number of Americans, and so conservatives at the highest levels have shifted their tactics to include aggressively broadcasting fictions. Rosen made this argument in a new piece Sunday aimed at mainstream journalists, but just as applicable to platform policy teams:

The Republican Party is increasingly a minority party, or counter-majoritarian, as some political scientists put it. The beliefs and priorities that hold it together are opposed by most Americans, who on a deeper level do not want to be what the GOP increasingly stands for. A counter-majoritarian party cannot present itself as such and win elections outside its dwindling strongholds. So it has to be counterfactual, too. It has to fight with fictions. Making it harder to vote, and harder to understand what the party is really about—these are two parts of the same project. The conflict with honest journalism is structural. To be its dwindling self, the GOP has to also be at war with the press, unless of course the press folds under pressure.

I’d go a step further and argue that the GOP has a structural conflict with Facebook as well. To the extent that Facebook and other social platforms are serious about reducing the spread of misinformation, they will always be at war with the Republican party. This war will be waged in various proxy battles — the current fight “against” Section 230 is the most visible — but the real fight is about the freedom to tell big political lies and receive immediate, wide amplification for them, with the implicit threat of punishment for platforms that refuse.

If the Democrats win big tomorrow, platforms may feel as if they have newfound freedom to take an even firmer hand against elected officials and other politicians who spread misinformation. “Neutrality” may fall into further disfavor. If not, though, expect to see more of the 50/50 thinking that the Post uncovered.

IV. Bonus links

If you’re a platform policy completionist, or just hard up for reading material as you await results tomorrow, here are some additional items that caught my eye.


The Ratio

Today in news that could change public perception of the big tech companies

⬆️ Trending up: Facebook will limit the distribution of the “save the children” hashtag after it was co-opted by QAnon. People who search for the hashtag will now be directed to credible child safety resources. (Brian Heater and Taylor Hatmaker / TechCrunch)

⬇️ Trending down: A review of fact-checking claims during the five-day period after Trump was diagnosed with COVID-19 found that Facebook consistently failed to apply labels to content flagged by its fact checkers. Very small changes to memes previously flagged as false allowed them to be re-posted without labels. (Priyanjana Bengani and Ian Karbal / Columbia Journalism Review)


Governing

A federal judge blocked the Trump administration from moving ahead with its TikTok ban. Jacob Kastrenakes has the story at The Verge:

TikTok is suing the Trump administration and Commerce Department to block its app from being banned, but this ruling actually came from another lawsuit: three TikTok creators who were concerned that the ban would prevent them from earning a living. The judge sided with their argument that TikTok videos constitute “informational materials,” which are protected under the relevant law.

“The short videos created and exchanged on TikTok are expressive and informative, and are analogous to the ‘films,’ ‘artworks,’ ‘photographs,’ and ‘news wire feeds’ expressly protected under” the International Emergency Economic Powers Act, the judge wrote.

Local officials say they are scrambling to fight all manner of misinformation in the run-up to Election Day. Around the country, hundreds of people are calling local election officials every day to parrot conspiracy theories. (Kellen Browning and Davey Alba / New York Times)

Does Facebook have the most to lose from a Joe Biden victory this week? Democratic congressional staffers and lobbyists say a Biden victory could hamper the company’s lobbying efforts just as critical issues around Section 230 reform and antitrust law take center stage. (Christopher Stern / The Information)

In the Philippines, government misinformation campaigns targeting critics of the state are being amplified on Facebook — and several victims have now been murdered in extrajudicial killings. After Facebook’s Free Basics program helped bring people online, 97 percent of the country now uses the platform. (Peter Guest / Rest of World)

QAnon received an early boost from Twitter accounts linked to the Russian government. “From November 2017 on, QAnon was the single most frequent hashtag tweeted by accounts that Twitter has since identified as Russian-backed, a Reuters analysis of the archive shows, with the term used some 17,000 times.” (Joseph Menn / Reuters)

Apple rejected a get-out-the-vote app focused on Pennsylvania that let users check their phone contacts against a public database to see whether people had voted. The App Store prohibits apps that compile personal information, even from public sites. (Mikey Campbell / Apple Insider)

LinkedIn co-founder Reid Hoffman launched a $1 million ad campaign urging people to be patient while waiting for election results. Unfortunately it’s not working, Reid. I feel very impatient! (Alayna Treene / Axios)

Google must respond to the Department of Justice’s antitrust suit against the company by November 13. (Reuters)

A group of 46 leading US companies, including Apple, Facebook, Google, and Twitter, is supporting a legal challenge to block upcoming changes to H-1B visa eligibility. The Trump administration wants to shift the system, which companies use to hire skilled foreign workers, from a lottery to one that awards visas based on how much the jobs pay. The move could dramatically increase the cost of foreign labor. (Nandita Mathur / LiveMint)

Researchers predict that politicians will increasingly turn to “nanoinfluencers” with 10,000 followers or fewer to promote their campaigns. Loopholes in platform advertising policies make it much harder to track spending on these campaigns. (Arielle Pardes / Wired)


Industry

Dan Bongino, a right-wing commentator whose Facebook page often gets more engagement than any other on the platform, says he is perplexed by his success. Naturally, he complains incessantly about the censorship of conservatives on the platform. Kevin Roose profiles him at the New York Times:

Mr. Bongino, 45, has become a lightning rod on the left, both because of his growing audience and because he has been criticized for posting exaggerated and misleading information. He was one of the most aggressive promoters of “Spygate,” a dubious conspiracy theory about an illegal Democratic plot to spy on Mr. Trump’s 2016 campaign. He falsely claimed that masks are “largely ineffective” at preventing the spread of Covid-19, and has promoted unproven claims about voter fraud as well as stoking fears about a Democrat-led coup. (Mr. Bongino has claimed that he was merely repeating left-wing claims about post-election violence.)

Plenty of people have fact-checked Mr. Bongino. But nobody has figured out what, exactly, has lifted him above the legions of other pro-Trump influencers battling for attention online.

Twitter’s board of directors issued a vote of confidence in CEO Jack Dorsey and its current management team. The activist investor group that threatened to oust Dorsey earlier this year appears to be backing off — for now. (Alex Heath / The Information)

In leaked audio, Facebook moderators in Dublin express concerns about returning to the office during a surge in COVID-19 cases. “‘I think every day, if I lost my husband, if anything happened to me, who could take care of my six-year-old son?’ one moderator asked last week, breaking down in tears.” (David Gilbert / Vice)

The chat platform Discord has thrived during the pandemic. The company now has more than 100 million users across millions of communities — as well as ongoing moderation challenges in its various right-wing and conspiracy-oriented groups. (David Pierce / Protocol)

Amazon is looking into opening up small shipping hubs in rural areas around the United States. The move is expected to help the company reduce its reliance on the US Postal Service. (Paris Martineau / The Information)

ByteDance, the parent company of TikTok, unveiled its first hardware product: a “smart lamp” with a display screen. I’m sorry, what? (Manish Singh / TechCrunch)

Digital spokespeople are becoming a larger segment of the influencer marketing industry, which is expected to grow to $15 billion by 2022. These virtual mascots are cheaper than real human beings, and never age — or complain. (Thuy Ong / Bloomberg)


Things to do

Stuff to occupy you online during quarantine

RadiTube is a search engine that makes the content of radical YouTube communities searchable for journalists and researchers. Have fun!

Play a Biden-Harris “Build Back Better” map in Fortnite. “Signs around the map encourage players to visit makeaplan.com, and players can text ‘Fortnite’ to 30330 to learn how to early-vote in person, drop off a ballot, or vote on Election Day.” (Kim Lyons / The Verge)

Distract yourself from the election with the New York Times’ slideshow of calming videos. Just brilliant work from the Styles team here.

Just lie on the floor and scream. A persuasive case that tantrums are underrated. (Jess Zimmerman / Slate)


Those good tweets

The entire Upcoming Oreos Twitter account.


Talk to me

Send me tips, comments, questions, and election predictions: casey@platformer.news.