Newcomer Podcast

Talking Threads With the Facebook Whistleblower Frances Haugen

Haugen came on the podcast to discuss her criticisms of Facebook as the world embraces the company's new app, Threads

Elon Musk is the liberal elite’s enemy of the moment.

How quickly the bad blood toward Mark Zuckerberg is forgotten.

When Zuckerberg’s Meta released Twitter rival Threads, reporters and left-leaning types (myself included) flocked to the new app as a potential refuge from Musk’s Twitter.

“The enemy of my enemy is my friend” seemed to be the logic of the moment.

I invited Facebook whistleblower Frances Haugen onto the podcast to discuss the sudden embrace of Threads, her ongoing criticisms of how Facebook operates, and her new book, The Power of One.

Haugen, for one, has not forgotten the problems with Facebook. She hasn’t downloaded Threads.

I said on the podcast, “As a reporter, it’s funny to see the reporter class embracing Threads at the moment when two years ago, or even more than that, they would have been so negative and apprehensive about trusting Facebook. I’m just curious, watching the pretty upbeat response to Threads, what do you take from that, and are you surprised there seems to be some media trust for Facebook right now?”

Haugen was empathetic toward people fleeing Twitter for Threads.

“I think it’s one of these things where the trauma the Twitter community has faced in the last year is pretty intense,” Haugen told me. “People really liked having a space to discuss ideas, to discuss issues, and the idea that they could have a space again feels really good.”

We spent much of the episode getting into the particulars of The Facebook Files and her criticisms of Facebook.

She outlines a core critique in The Power of One’s introduction:

One of the questions I was often asked after I went public was, “Why are there so few whistleblowers at other technology companies, like, say, Apple?” My answer: Apple lacks the incentive or the ability to lie to the public about the most meaningful dimensions of their business. For physical products like an Apple phone or laptop, anyone can examine the physical inputs (like metals or other natural resources) and ask where they came from and the conditions of their mining, or monitor the physical products and pollution generated to understand societal harms the company is externalizing. Scientists can place sensors outside an Apple factory and monitor the pollutants that may vent into the sky or flow into rivers and oceans. People can and do take apart Apple products within hours of their release and publish YouTube videos confirming the benchmarks Apple has promoted, or verify that the parts Apple claims are in there, are in fact there. Apple knows that if they lie to the public, they will be caught, and quickly.

Facebook, on the other hand, provided a social network that presented a different product to every user in the world. We—and by we, I mean parents, children, voters, legislators, businesses, consumers, terrorists, sex-traffickers, everyone—were limited by our own individual experiences in trying to assess What is Facebook, exactly? We had no way to tell how representative, how widespread or not, the user experience and harms each of us encountered was. As a result, it didn’t matter if activists came forward and reported Facebook was enabling child exploitation, terrorist recruiting, a neo-Nazi movement, and ethnic violence designed and executed to be broadcast on social media, or unleashing algorithms that created eating disorders or motivated suicides. Facebook would just deflect with versions of the same talking point: “What you are seeing is anecdotal, an anomaly. The problem you found is not representative of what Facebook is.”

To jog your memory for the episode: in September 2021, the Wall Street Journal published the first in a series of articles, called the Facebook Files, about the company’s cross-check program, which gave high-profile users special treatment in the company’s moderation decisions.

The Journal followed that report with a story about how Facebook’s internal research showed that 32% of teen girls said that “when they felt bad about their bodies, Instagram made them feel worse.”

The third story in the series showed that Facebook’s decision to prioritize “meaningful social interactions” seemed to have the opposite effect, giving more reach to posts that instigated conflict and anger.

Perhaps most damning, in my mind, was the Journal’s fourth story in the series, which showed that Facebook had failed to implement internationally many of the table-stakes moderation practices it applies in the U.S.

The Journal won a Polk Award for its reporting.

I have at times been skeptical of how damning these stories were.

It’s not that crazy to me that Facebook would want to give extra attention to moderation decisions involving public figures.

Is Instagram harming teen girls more than Vogue or Cosmo?

So it was fun to finally hash out some of these issues with Haugen on the podcast.

Ultimately, I think we were mostly aligned that we both support much better disclosure requirements for Facebook. Regulators are fighting with both arms tied behind their backs.

I was disappointed, however, that Haugen seemed to bend over backward to come off as apolitical in her critique of Facebook. She didn’t really engage with the obvious political asymmetry: Republicans are clearly much more likely to post the type of content that Democrats would call misinformation.

I think that’s a fair statement whatever you think of “misinformation.”

Anyway, that should give you enough context to dig into our conversation. Enjoy!

Give it a listen

Listen on Apple

Listen on Spotify


Highlighted Excerpts

The transcript has been edited for clarity.

Eric: How would you see a disclosure regime working that still allows companies like Facebook to be flexible and to change?

Frances: I think a lot of people don’t sit and think about what the menu of options is when it comes to intervening in a problem as complicated as this. I’m really glad that you brought up the idea that these companies grow and change, and that the next one to come along might not fit the exact same mold as this one. One of the ways the European Union handles that flexibility — and to be really clear, this way of doing regulation, through disclosure and transparency, is instead of something like what’s happening in Utah, where Utah is coming in and saying, “This is how you will run your company”: if people are under 18, they have to have parental supervision, no privacy for kids, their parents can see everything — or like Montana coming out and just flat-out banning TikTok. Those are kind of “building fences” type rules, where we’re like, “Oh, this is the fence you can’t cross.” And the thing about technology is it moves and changes, and they’re very good at running around fences.

So the alternative is something like what the European Union passed last year, which is called the Digital Services Act. And the Digital Services Act says, “Hey, if the core problem is a power imbalance — the fact that you can know what’s going on and I can’t — let’s address that core problem, because a lot of other things will flow downstream from it.” So they say, “Hey, if you know there’s a risk with your product, you need to tell us about it. If you discover one, or if you can even imagine one, you need to tell us about it. You need to tell us your plan for mitigating it, because it’s going to be different for every platform — we want to unleash innovation. And you need to give us enough data that we can see there’s progress being made toward that goal. And if we ask you a question, we deserve to get an answer,” which sounds really basic, but it’s not true today.


Eric: Some of these problems that you’ve identified are just human problems. Take the Instagram critique — that it’s potentially making some segment of young teenage women unhappy. I mean, you could say, was that so different from Vogue? Is this really an algorithmic problem?

Frances: There have always been teen girls who were unhappy about their bodies or how nice their clothes were. But there are a limited number of pages in Vogue every month. The second time you read Vogue, it’s going to have a different impact on you than the third time you read it — or you’re just going to get bored of it. And in the case of something like Instagram, Instagram progressively pushes you towards more and more extreme content.

A 13-year-old girl might start out by looking for something like healthy recipes, and just by clicking on the content get pushed over time toward more and more extreme material.


Eric: Why did you decide to come out and reveal your identity?

Frances: I had been contemplating for quite a while whether I would have to come forward at some point. I had a chance to talk to my parents about it a large number of times, just because what I was seeing on a day-to-day basis while I lived with them during COVID was so different from what Facebook’s public narrative was on these issues. But the moment where I was like, “Okay, I have no other options,” was right after the 2020 election — so this was in December, less than 30 days after the election. They pulled us all together on Zoom and said, you know how for the last four years, the only part of Facebook that was growing was the Civic Integrity team? That was the team for Facebook.com that aimed to ensure Facebook was a positive social force in the world — that it wasn’t going to disrupt elections, that it wasn’t going to cause any more genocides, because by that point there had been two. They said, “Hey, you are so important; we’re going to dissolve your team and integrate it into the rest of Facebook.”

And when they did that, that was kind of the moment where I realized Facebook had given up. That the only way that Facebook was going to save itself was if the public got involved. That the public had to come and save Facebook.
