So, it’s going to be a big week — or weeks. With that in mind, let’s look at a handful of topics to keep an eye on as Election Day / Week / Season unfolds: the last-minute policy preparations of the tech platforms; Facebook’s two-front battle on political ads and fact-checking; and assorted electoral odds and ends. For four years the big social networks have sought an opportunity to show that they can police their platforms better than the 2016 election suggested they could.
Tomorrow they get their chance.
I. The platforms look busy
You know how in the Before Times, when your boss walked by your desk, you would instinctively make a series of dramatic movements to indicate that you were extremely busy and therefore deserved continued employment?
The platforms’ behavior over the past week or so has a real boss-walking-by-the-desk energy:
Facebook quietly suspended political group recommendations ahead of the election. The move, which also applies to groups related to social issues, comes amid concerns that algorithmic group recommendations promoted the growth of QAnon and militia groups on Facebook this year.
Facebook will also now alert you of the original date a piece of COVID-19-related content was shared before you re-share it. This bit of friction can alert people to the fact that they may be about to share old, outdated posts, slowing their spread.
Facebook’s elections operation center is up and running, coordinating between 40 (!) different teams within the company. It’s also sharing information with other companies, Facebook’s head of cybersecurity policy tells Charlie Warzel. “My counterpart at Twitter says I call him more than his mother does,” Nathaniel Gleicher says.
Facebook also showed off the label it will affix to posts from candidates and campaigns that declare premature victory.
Instagram temporarily removed the “recent” tab that normally appears when you click on any hashtag. The idea is to stop bad actors from “camping” on popular hashtags and spamming them with misinformation.
Twitter said that starting on Election Day, it will begin adding labels to tweets that claim victory before results are final. This includes tweets from presidential candidates, US accounts with more than 100,000 followers, and tweets that garner more than 25,000 likes, retweets, or quote tweets. I really like this approach, which is focused on examining posts that get the most reach, no matter who tweets them.
Happily, it’s not only tech platforms monitoring digital election threats this week. The Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency has launched a virtual war room to receive reports from local election officials and convey classified information from intelligence agencies about efforts to undermine the election. The move is notable for occurring despite what the Washington Post describes here, accurately, as “an abiding lack of interest in election security from President Trump.”
Of course, platforms have been making adjustments to content policies related to elections, misinformation, and various potential harms since the aftermath of 2016. But this late flurry of moves shows two things. One, there’s a sincere determination to ensure that, no matter who becomes president in 2021, platforms are not seen to have tipped the outcome through lax policing of their platforms.
And two, they’re still figuring out the best way to police their platforms.
II. Facebook and political ads
The post-2016 election critique of tech platforms is that they were overrun with misinformation and influence operations, possibly tipping the outcome to Trump. Most of the policy tweaks above are aimed at preventing that kind of scenario from happening again.
But the prevailing 2020 election critique of platforms is that they are biased — against conservatives, or in their favor, depending on whom you ask — and you can see those fears manifested in the two big Facebook controversies that popped up over the weekend.
The first is that “technical issues” prevented some campaigns from running advertisements on the platform. On Tuesday, Facebook flipped a switch blocking political advertisers from uploading new ads in the week before Election Day, but the change inadvertently prevented some pre-approved ads from running. The Joe Biden campaign, which has been swinging a bat at Facebook for much of the election, said the issues cost it an estimated $500,000 in lost fundraising.
The issue also affected the Trump campaign, Politico reported, but Republicans were uncharacteristically quiet about what looked like good fodder for further complaints of partisan bias. Facebook published a blog post on Thursday night asserting that the issues had nothing to do with partisanship — and said that, in addition to technical problems, some issues in running ads had stemmed from advertiser confusion over its changing policies.
Mistakes happen, of course, but at the moment any mistake that can be chalked up to partisanship — fairly or unfairly — carries higher stakes. Both Biden and Trump have said they want to repeal Section 230 of the Communications Decency Act, which limits platforms’ liability for what their users publish, and so you can expect to hear more about these issues no matter who wins.
Incidentally, the Facebook story isn’t just “our ads won’t run” — it’s also “look at the ads that are running!” As the Wall Street Journal reported on Sunday, Facebook pages that have had their ads removed for spreading misinformation have been able to easily re-post them by making slight changes to the wording.
III. Facebook and fact checks
The other big bias story that Facebook is up against right now concerns fact checking.
Last month, using a trove of internal audio that was leaked to me, I wrote about how this summer Facebook had doubled down on centrism — committing itself to a position of “neutrality” whenever possible, and often learning that doing so is effectively impossible. The company’s strained efforts at neutrality — what the press critic Jay Rosen has called “the view from nowhere” — can warp platforms in strange ways.
In an important new story in the Washington Post, Isaac Stanley-Becker and Elizabeth Dwoskin lay out just how literally some Facebook employees took the neutrality mandate:
One of the people familiar with internal deliberations said some efforts to improve fact-checking and content review have been stymied by concerns about a disproportionate impact on conservative users. Members of Facebook’s public policy team recently floated a proposal ensuring that a new system to escalate harmful posts do so evenly along ideological lines, the person said, so that 50 percent of the escalated material would be conservative and 50 percent would be liberal, even if the material was not equivalent in potential risk.
On one hand, this idea was not implemented. On the other, the fact that it was even floated raises some concerns. As the political scientist Brendan Nyhan noted: “There is no reason to think the share of false claims is precisely 50/50 at any given time. Every credible study finds that the misinformation ecosystem is asymmetric. That's not bias; it’s a scientific fact.”
It may be a fact, but it is a fact that is at odds with centrism. Centrism presupposes that both sides of the political spectrum are good-faith actors abiding by a shared set of rules. Individual actors may stray into extremism here and there, but the centrist is committed to finding a middle path.
The problem is that the Republican party represents a shrinking number of Americans, and so conservatives at the highest levels have shifted their tactics to include aggressively broadcasting fictions. Rosen made this argument in a new piece Sunday aimed at mainstream journalists, but just as applicable to platform policy teams:
The Republican Party is increasingly a minority party, or counter-majoritarian, as some political scientists put it. The beliefs and priorities that hold it together are opposed by most Americans, who on a deeper level do not want to be what the GOP increasingly stands for. A counter-majoritarian party cannot present itself as such and win elections outside its dwindling strongholds. So it has to be counterfactual, too. It has to fight with fictions. Making it harder to vote, and harder to understand what the party is really about—these are two parts of the same project. The conflict with honest journalism is structural. To be its dwindling self, the GOP has to also be at war with the press, unless of course the press folds under pressure.
I’d go a step further and argue that the GOP has a structural conflict with Facebook as well. To the extent that Facebook and other social platforms are serious about reducing the spread of misinformation, they will always be at war with the Republican party. This war will be waged in various proxy battles — the current fight “against” Section 230 is the most visible — but the real fight is about the freedom to tell big political lies, and receive immediate and wide amplification for them, or else.
If the Democrats win big tomorrow, platforms may feel as if they have newfound freedom to take an even firmer hand against elected officials and other politicians who spread misinformation. “Neutrality” may fall into further disfavor. If not, though, expect to see more of the 50/50 thinking that the Post uncovered.
IV. Bonus links
If you’re a platform policy completionist, or just hard up for reading material as you await results tomorrow, here are some additional items that caught my eye.
Twitter reinstated the New York Post’s account after the Hunter Biden drama. The Post could have gotten its account back by removing an offending tweet, which it refused to do. But Twitter changed its policy, allowing the tweet to remain, and so reinstated the account.
Leaked Facebook moderation guidelines show that the company manually intervened to stop the spread of the Hunter Biden story. It still hasn’t said much specifically about why it intervened, though, except to say it has been on the lookout for a hack-and-leak campaign.
You can also read Facebook’s internal rules on voter suppression and disinformation, which leaked to Vice.
President Trump’s ads are all over YouTube’s homepage through Election Day. His campaign bought out the inventory in February.
Trump campaign poll watchers are expected to heavily promote videos on social networks this week purporting to show evidence of voter fraud. The videos don’t need to withstand close scrutiny to be effective — they just need to help the campaign sell the idea that the election is being stolen, however baseless that might be.
“Parler, Gab, Discord, WhatsApp, Telegram, Reddit, and Twitch do not have election-related moderation policies,” according to the Election Integrity Partnership. The Markup noted this in a comprehensive roundup of platform election policies.
Dangerous radical Joanna Stern suggests not using social networks until the election is decided. Nice try, Joanna.
Today in news that could change public perception of the big tech companies
⬆️ Trending up: Facebook will limit the distribution of the “save the children” hashtag after it was co-opted by QAnon. People who search for the hashtag will now be directed to credible child safety resources. (Brian Heater and Taylor Hatmaker / TechCrunch)
⬇️ Trending down: A review of fact-checking claims during the five-day period after Trump was diagnosed with COVID-19 found that Facebook consistently failed to apply labels to content flagged by its fact checkers. Very small changes to memes previously flagged as false allowed them to be re-posted without labels. (Priyanjana Bengani and Ian Karbal / Columbia Journalism Review)
⭐ A federal judge blocked the government from moving ahead with the Trump administration’s TikTok ban. Jacob Kastrenakes has the story at The Verge:
TikTok is suing the Trump administration and Commerce Department to block its app from being banned, but this ruling actually came from another lawsuit: three TikTok creators who were concerned that the ban would prevent them from earning a living. The judge sided with their argument that TikTok videos constitute “informational materials,” which are protected under the relevant law.
“The short videos created and exchanged on TikTok are expressive and informative, and are analogous to the ‘films,’ ‘artworks,’ ‘photographs,’ and ‘news wire feeds’ expressly protected under” the International Emergency Economic Powers Act, the judge wrote.
Local officials say they are scrambling to fight all manner of misinformation in the run-up to Election Day. Hundreds of people are calling local election officials every day around the country parroting conspiracy theories. (Kellen Browning and Davey Alba / New York Times)
Does Facebook have the most to lose from a Joe Biden victory this week? Democratic congressional staffers and lobbyists say a Biden victory could hamper the company’s lobbying efforts just as critical issues around Section 230 reform and antitrust law take center stage. (Christopher Stern / The Information)
In the Philippines, government misinformation campaigns targeting its critics are being amplified on Facebook — and several victims have now been murdered in extra-judicial killings. After its Free Basics program helped bring people online, 97 percent of the country’s internet users are now on Facebook. (Peter Guest / Rest of World)
QAnon received an early boost from Twitter accounts linked to the Russian government. “From November 2017 on, QAnon was the single most frequent hashtag tweeted by accounts that Twitter has since identified as Russian-backed, a Reuters analysis of the archive shows, with the term used some 17,000 times.” (Joseph Menn / Reuters)
Apple rejected a get-out-the-vote app focused on Pennsylvania that let users check their phone contacts against a public database to see whether people had voted. The App Store prohibits apps that compile personal information, even from public sites. (Mikey Campbell / Apple Insider)
LinkedIn co-founder Reid Hoffman launched a $1 million ad campaign urging people to be patient while waiting for election results. Unfortunately it’s not working, Reid. I feel very impatient! (Alayna Treene / Axios)
Google must respond to the Department of Justice’s antitrust suit against the company by November 13. (Reuters)
A group of 46 leading US companies, including Apple, Facebook, Google, and Twitter, are supporting a legal challenge to block upcoming changes to H-1B visa eligibility. The Trump administration wants to shift the system, which is used to hire skilled foreign workers, from a lottery to one that prioritizes applications by salary. The move could dramatically increase the cost of foreign labor. (Nandita Mathur / LiveMint)
Researchers predict that politicians will increasingly turn to “nanoinfluencers” with 10,000 followers or fewer to promote their campaigns. Loopholes in platform advertising policies make it much harder to track spending on these campaigns. (Arielle Pardes / Wired)
⭐ Dan Bongino, a right-wing commentator whose Facebook page often gets more engagement than any other on the platform, says he is perplexed by his success. Naturally, he complains incessantly about the censorship of conservatives on the platform. Kevin Roose profiles him at the New York Times:
Mr. Bongino, 45, has become a lightning rod on the left, both because of his growing audience and because he has been criticized for posting exaggerated and misleading information. He was one of the most aggressive promoters of “Spygate,” a dubious conspiracy theory about an illegal Democratic plot to spy on Mr. Trump’s 2016 campaign. He falsely claimed that masks are “largely ineffective” at preventing the spread of Covid-19, and has promoted unproven claims about voter fraud as well as stoking fears about a Democrat-led coup. (Mr. Bongino has claimed that he was merely repeating left-wing claims about post-election violence.)
Plenty of people have fact-checked Mr. Bongino. But nobody has figured out what, exactly, has lifted him above the legions of other pro-Trump influencers battling for attention online.
Twitter’s board of directors issued a vote of confidence in CEO Jack Dorsey and its current management team. The activist investor group that threatened to oust Dorsey earlier this year appears to be backing off — for now. (Alex Heath / The Information)
In leaked audio, Facebook moderators in Dublin express concerns about returning to the office during a surge in COVID-19 cases. “I think every day, if I lost my husband, if anything happened to me, who could take care of my six-year-old son?” one moderator asked last week, breaking down in tears. (David Gilbert / Vice)
The chat platform Discord has thrived during the pandemic. The company now has more than 100 million users across millions of communities — as well as ongoing moderation challenges in its various right-wing and conspiracy-oriented groups. (David Pierce / Protocol)
Amazon is looking into opening up small shipping hubs in rural areas around the United States. The move is expected to help the company reduce its reliance on the US Postal Service. (Paris Martineau / The Information)
ByteDance, the parent company of TikTok, unveiled its first hardware product: a “smart lamp” with a display screen. I’m sorry, what? (Manish Singh / TechCrunch)
Digital spokespeople are becoming a larger segment of the influencer marketing industry, which is expected to grow to $15 billion by 2022. These virtual mascots are cheaper than real human beings, and never age — or complain. (Thuy Ong / Bloomberg)
Things to do
Stuff to occupy you online during quarantine
RadiTube is a search engine that makes the content of radical YouTube communities searchable for journalists and researchers. Have fun!
Play a Biden-Harris “Build Back Better” map in Fortnite. “Signs around the map encourage players to visit makeaplan.com, and players can text “Fortnite” to 30330 to learn how to early-vote in person, drop off a ballot, or vote on Election Day.” (Kim Lyons / The Verge)
Distract yourself from the election with the New York Times’ slideshow of calming videos. Just brilliant work from the Styles team here.
Just lie on the floor and scream. A persuasive case that tantrums are underrated. (Jess Zimmerman / Slate)
Those good tweets
The entire Upcoming Oreos Twitter account.
Talk to me
Send me tips, comments, questions, and election predictions: email@example.com.