The Senate blasts tech CEOs over child safety

After another empty spectacle, can anyone find a path forward?

Mark Zuckerberg speaks to families during Wednesday's Senate hearing on child safety in Washington, DC. (Anna Moneymaker / Getty Images)

Each time a reluctant tech CEO is dragged before Congress to answer questions about the harms that take place on their platforms, I strive to keep an open mind. Perhaps this will be the time, I tell myself, that we hear a productive discussion on the much-needed reforms that tech companies are often too slow to implement. 

But while Congress is generally more educated on tech subjects today than it was when the backlash began in 2017, the hearings still play out much as they did at the beginning: with outraged lawmakers scolding, questioning, and interrupting their witnesses for hours on end, while bills that might address their concerns continue to languish without ever being passed. With so little of substance accomplished, the press can only comment on the spectacle: of the loudest protesters, the harshest insults, and the tensest exchanges. 

After five hours of combative testimony, Wednesday’s Senate Judiciary Committee hearing on child safety appears destined to be remembered mostly as a tech hearing like any other: long on talk, and short on hope that it will lead to a bill being passed.

The event’s signature moment was the seemingly impromptu apology that Meta CEO Mark Zuckerberg offered to protesters in the audience. Here’s Angela Yang at NBC News:

"I’m sorry for everything you’ve all gone through," Zuckerberg said after Sen. Josh Hawley, R-Mo., pressed him about whether he would apologize to the parents directly. "It’s terrible. No one should have to go through the things that your families have suffered." [...] 

After he apologized, Zuckerberg told parents that "this is why we invest so much and are going to continue doing industry-leading efforts to make sure that no one has to go through the types of things that your families have had to suffer.”

I say “seemingly” because Meta was eager to get the word out: Zuckerberg’s fully transcribed apology arrived in my inbox from a Meta spokeswoman moments after he gave it, and the company also posted his remarks to social networks.

Ultimately, though, the goal of the hearing was not to get an apology from one of the CEOs. It was to press the leaders of Meta, TikTok, Snap, Discord, and X on the steps they take to protect children from the harms they face on those platforms: bullying, grooming, extortion, and many more. 

Ahead of the hearing, Meta’s former head of youth policy shared a list of “red herrings” that participants would use to dodge the real issues: invoking one’s status as a parent; citing the number of people the company has working on the issue; citing the number of features it has introduced to protect children; and promoting all the conversations it has had with stakeholders. 

Touting those numbers allows companies to escape scrutiny of how their corporate structure and design processes often relegate child safety to the sidelines, Vaishnavi J. wrote. 

She suggested alternative questions: “How do you incorporate responsible design into your product development processes? What are your internal review processes and escalation paths to ensure that any existing or new product meets a predetermined set of online safety requirements? Over the last five years, how often have you blocked products from launching because they were not safe enough for children, or withdrawn products from the market after receiving feedback on the harms they were causing?”

These are good questions, and I did not hear any good answers to them at Wednesday’s hearing.

Instead we got theater. Sen. Lindsey Graham told Zuckerberg, “You have blood on your hands.” Sen. Tom Cotton pursued what the Washington Post accurately described as “a McCarthy-esque line of questioning” against TikTok CEO Shou Zi Chew, whom he repeatedly asked about his citizenship and whether he was a member of the Chinese Communist Party. (Chew is Singaporean.) 

And, as usual, senators took the chance to play a round of do-you-support-my-bill-yes-or-no with the assembled CEOs, cutting off any answer that wasn’t a simple yes or no before nuance could enter the discussion.

The three first-time congressional witnesses benefited from senators’ focus on the veterans. Discord CEO Jason Citron appeared to struggle but was mostly left alone, as was Snap CEO Evan Spiegel. X CEO Linda Yaccarino spun out an elaborate fantasy about X being a 14-month-old company, as if Twitter had never existed, and somehow got away with it. 

Somewhere in all this, a handful of good ideas were heard. Zuckerberg pushed persuasively for age verification at the device level, which among other things would prevent parents from having to navigate child safety controls inside dozens of individual apps on their children’s phones. Some senators pushed for expanded resources for law enforcement to investigate and prosecute those trading child sexual abuse material. 

And as Alicia Blum-Ross, former policy director at Twitch, noted today, platforms have gradually begun to take more seriously the idea that apps should be designed differently for teens than for adults. She argues that pushing for changes in user experience will likely benefit teens more than blocking them from using social media. 

“A more restrictive default, combined with a well-timed forced-choice in the user experience, can provide the friction needed for a teen to reconsider a risky post or comment,” she wrote. “Age-tuned settings, rather than blocking access, is far more palatable for older teens than creating a walled garden they won’t use, leading them to seek out platforms with fewer protections in place.”

All that said, I still worry that both the Senate and the CEOs are falling into the trap of techno-solutionism. There’s no doubt that tech companies can and should reduce harm by working to reduce the spread of bullying, harmful content, CSAM, and extortion. 

But it would be a mistake to lay the broader teen mental health crisis at the feet of tech companies alone. As researcher danah boyd, who has long studied children and social media, wrote this week in a piece criticizing the Kids Online Safety Act:

Bills like KOSA don’t just presume that tech caused the problems youth are facing; they presume that if tech companies were just forced to design better, they could fix the problems. María Angel pegged it right: this is techno-legal-solutionism. And it’s a fatally flawed approach to addressing systemic issues. Even if we did believe that tech causes bullying, the idea that they could design to stop it is delusional. Schools have every incentive in the world to prevent bullying; have they figured it out? And then there’s the insane idea that tech could be designed to not cause emotional duress. Sociality can cause emotional duress. The news causes emotional duress. Is the message here to go live in a bubble? 

The solution is not “make tech fix society.” The intervention we need to an ecological problem is an ecological one. 

For all the hearing’s flaws, I do believe tech companies should face pressure to limit the harm on their platforms. Recent revelations from the state attorneys general lawsuit against Meta have laid out in disturbing detail the extent to which the company identified risks to young people and did too little to reduce them.

But we shouldn’t view the platforms in a vacuum, either. Whatever platforms do to support teens won’t change the fact that mental health care remains broadly inaccessible, dozens of school shootings take place every year, and teens continue to suffer the traumatic effects of living through a global pandemic.

Tech companies may indeed have teens’ blood on their hands, as Graham told Zuckerberg. But we should never forget that Congress does, too.


X's day in court

The first National Labor Relations Board hearing for Elon Musk’s X Corp. wrapped up yesterday. We don’t have a decision — the briefing period alone extends until March — but the facts don't look good for Musk’s company.

To recap, this case involves Yao Yue, a former principal engineer at X who was fired on November 15, 2022, after tweeting the following (and posting a similar message on Slack): "Don't resign, let him fire you. You gain literally nothing out of a resignation."

Yue filed an unfair labor practice charge last March, and the NLRB issued a complaint alleging the speech was protected and the firing was illegal. 

Musk’s lawyers, from Morgan, Lewis & Bockius — the same firm representing SpaceX in a similar case — argued the complaint was “dead on arrival” because, in their view, Yue was a supervisor (a classification that is typically not covered by the National Labor Relations Act), and her message to colleagues was insubordinate. 

Yue was a manager on the infrastructure team prior to the November 4 layoffs. But afterward, as X went through a series of reorganizations, she was demoted to an individual contributor. Two witnesses supported these facts, including former global head of infrastructure Nelson Abramson, as did evidence introduced by Musk’s own team, which inadvertently showed that Yue’s teammate was seeking work-from-home approval from Yue’s manager rather than from Yue. 

Abramson estimated that prior to the Nov. 4 layoffs, the infrastructure team had roughly 1,000 people on it, including around 150 managers. After the layoffs, when Musk specifically targeted managers, the team had maybe 10 or 15 managers left. He noted that after Nov. 4, Yue wasn’t a manager and had no authority as one. 

The second point, that Yue’s tweet was insubordinate, doesn't seem to hold water. Yue was engaging in collective action, telling her colleagues that it wasn’t in their best interest to resign, rather than ordering them to disobey Musk’s directive. “I’d be shocked if this board found that to be insubordination,” former labor board member Wilma Liebman told Bloomberg.

Musk’s lawyers didn’t even seem to understand the difference between a quote tweet and a reply. At one point, they tried to say that Yue’s offending tweet had enormous reach, because it was a “reply” to my tweet, and my account had tens of thousands of followers. Except it wasn’t a reply — it was a quote tweet, which meant it primarily reached Yue’s followers, of which she had fewer than 5,000. 

The wheels of justice grind slowly, and a decision won't arrive for at least a month. Should the NLRB find that X did in fact violate federal labor law, Yue could receive back pay — and the company could be forced to issue a notice informing employees of their right to organize.

Zoë Schiffer


Sponsored
Build Relationships. Catalyze Your Career.

The best leaders are lifelong learners who surround themselves with a close-knit community of people who offer support and insight. With the rate of change in tech, growing alongside other top leaders who understand the challenges you face is the difference between having a job and the career you want. Round is the private network for senior product and engineering leaders. With peer-based learning, member events, and a vibrant digital community, Round is a catalyst for your career in tech. Apply to join and mention Platformer to skip the waitlist.


On the podcast this week: Kevin and I try Apple's Vision Pro. Then, we react to clips from Wednesday's Senate hearing over child safety. And finally, we dig into how a single car accident took down Cruise.

Apple | Spotify | Stitcher | Amazon | Google | YouTube


Governing


Industry


Those good posts

For more good posts every day, follow Casey’s Instagram stories.

(Link)

(Link)

(Link)


Talk to us

Send us tips, comments, questions, and your online child safety solutions: casey@platformer.news and zoe@platformer.news.