Oct 17, 2025
Transcript
[RADIOLAB INTRO]
SIMON ADLER: Right in here.
KATE KLONICK: Awesome.
SIMON: You're gonna be speaking into that microphone.
KATE KLONICK: This guy?
SIMON: Nope, the one closer here.
KATE KLONICK: Okay.
SIMON: Hey, I'm Simon Adler. This is Radiolab.
SIMON: One, two, three, four, five. Can you hear me Kate?
KATE KLONICK: Yep.
SIMON: And that ...
KATE KLONICK: Oh wow!
SIMON: ... is Kate.
KATE KLONICK: Yeah. Kate Klonick. I'm a professor at St. John's Law School.
SIMON: I've talked to her a bunch over the years. We did a couple different stories that felt like news at the time about Facebook's rules for what we can and can't post on their platform.
KATE KLONICK: Don't get me saying the F word again, because last time my parents yelled at me. [laughs]
SIMON: Did they?
KATE KLONICK: Yeah, they were like, "Kate, you're an adult now."
SIMON: Oh, come on!
KATE KLONICK: "You're a serious person."
SIMON: I prefer to swear on the radio as much as possible.
KATE KLONICK: [laughs]
SIMON: We covered the origins of these rules and just how complicated they can become. But beyond the specifics, what we were really exploring was how the ideal of free speech plays out in different spaces in our society—you know, from a good old public square where anyone can say anything they want to lightly regulated broadcast TV to straight-up private spaces. And we were asking, like, where does social media fit into all that? And, you know, I kinda thought we were done talking about all this, but then ...
[ARCHIVE CLIP, Jimmy Kimmel: I'm happy we still have a show too, I guess.]
SIMON: ... this past month ...
[NEWS CLIP: Jimmy Kimmel can't say that anymore. The late night host taken off air indefinitely.]
SIMON: ... as we all know, free speech was in the news again.
[ARCHIVE CLIP, Brendan Carr: I mean, look, we can do this the easy way or the hard way.]
[ARCHIVE CLIP: That's censorship. That's state speech control.]
SIMON: And these questions of who can say what where, and how much pressure the government can and can't exert just felt fresh and vital all over again. And so I called Kate, yeah, to see how this is all playing out online.
KATE KLONICK: Yeah. And now it is a problem of okay, how do we stop billionaires and authoritarian governments from twisting these platforms into censorship machines or political propaganda?
SIMON: Okay. [bleep].
KATE KLONICK: I know. That's kind of how I feel, too.
SIMON: [sighs] Well, I guess before we get into all of that, let's build a bit of a foundation first.
KATE KLONICK: Sure.
SIMON: So I guess how has the actual practice of keeping stuff up and taking stuff down changed, and why?
KATE KLONICK: Sure. So the main thing—the main thing from the last time we talked that has really, truly changed from, like, 2020 to 2025 is the rise of TikTok. I mean, if you will remember, in, like, two short years it had basically caught up with twelve years of Facebook's growth. And I mean, TikTok has a different way that they run their content moderation.
SIMON: Okay. How so?
KATE KLONICK: Well, when we spoke in these past episodes, one of the assumptions of content moderation when it was getting off the ground, be it Facebook or Instagram or YouTube, was that we don't want to censor people unnecessarily.
SIMON: Yep.
KATE KLONICK: And so you would keep content up until it was reported as being harmful, and then you would make rules that would limit and try to "preserve voice" as much as possible, as they put it, that was like the industry term for free speech: "voice." There were limits to that, obviously, but generally, like, it was a "keep it up unless we have to take it down" type of thing. But that's not TikTok. TikTok comes from, obviously, China, and it comes from a censorship, kind of authoritarian, CCP culture. And, I mean, I believe the Chinese kind of approach to speech is very reflected in the algorithm that TikTok uses. It is not a default, "everyone should see everything. This is a free world, and people have a right to say whatever they want, even if it's a private platform." It is a "we get to determine what people see and say, and that's it."
SIMON: So they're just taking tons and tons and tons of stuff down.
KATE KLONICK: Oh, I mean, no. Like ...
SIMON: [laughs] Okay.
KATE KLONICK: TikTok, it pre-screens a huge volume of content for what they determine to be within certain political parameters, and so it's less likely to cause "negative interaction effects," to put kind of an economic term on it.
SIMON: Uh, if I can put a stupid man's term on it, it's like they are choosing to push things up instead of pull things down.
KATE KLONICK: That's a perfect way of thinking about it. And they push things up that are very milquetoast, very, like, happy, make you feel good, very apolitical. And so this is basically downranking or shadow banning—the idea that you're going to manipulate the algorithm to not delete the content, but not promote it. And in addition to that, the algorithm is constantly improving and iterating on all the behavioral signals that you give it, and so it's able to provide a very addictive and expectation-meeting ...
SIMON: Product.
KATE KLONICK: Yeah. Product. I mean, there's no way—I almost said "experience," but I'm like ...
SIMON: Yeah.
KATE KLONICK: ... it's kind of, but it's not. It's a—it's—I don't know what it is.
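[EDITOR'S NOTE: The "push up instead of pull down" approach described above, downranking rather than deleting, can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not TikTok's or any platform's actual system; the post fields, the "flagged" signal, and the 0.1 downranking multiplier are all invented for the example.]

    # A toy sketch of ranking-based moderation: nothing is removed,
    # borderline content is just scored so low it rarely surfaces.
    # All fields and weights here are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        engagement: float         # predicted watch/like signal, 0 to 1 (hypothetical)
        flagged_borderline: bool  # pre-screening marked it outside "safe" parameters

    def rank_feed(posts: list[Post]) -> list[Post]:
        """Return every post (nothing is deleted), ordered so that flagged
        content is quietly buried instead of taken down."""
        def score(p: Post) -> float:
            s = p.engagement
            if p.flagged_borderline:
                s *= 0.1  # downrank: still technically "up," but rarely seen
            return s
        return sorted(posts, key=score, reverse=True)

    feed = rank_feed([
        Post("cute dog video", 0.7, False),
        Post("political rant", 0.9, True),   # highest raw engagement, but buried
        Post("recipe clip", 0.5, False),
    ])
    print([p.text for p in feed])
    # ['cute dog video', 'recipe clip', 'political rant']
    # The rant was never removed, so there is no redaction and no evidence of
    # suppression, which is the prior-restraint problem discussed later in
    # the conversation.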
SIMON: I have a confession, which is that I've maybe spent five minutes on TikTok in my life.
KATE KLONICK: I don't have TikTok.
SIMON: You don't either?
KATE KLONICK: Well, I have, like, rules for some of these things. [laughs]
SIMON: Okay.
KATE KLONICK: But, you know, I study online speech for a living, so it seems kind of crazy, but I—like, I don't need to actually be on TikTok for TikTok to be all over my life. I see TikTok videos constantly. They are cross-posted. I don't need to actually be on TikTok.
SIMON: Well, and on that, it is interesting that TikTok figured out how to make banal stuff compelling, because we were certainly told that, well, the reason Facebook wants to leave some of this stuff up is because it's the—it's the highly emotive, highly reactive stuff that keeps people around. So what—what did we have wrong there? Was this just like an adjacent path to the same outcome, which is keeping people on a platform?
KATE KLONICK: Oh, I mean, I think that it's actually fascinating. You know, what they figured out is that it is a format of video that people are—are hooked by.
SIMON: Okay.
KATE KLONICK: And so it does not really matter. You will find yourself often watching things that you didn't know you were interested in but, like, you're just compelled by certain types of couples that, like, look very different from each other, doing any type of, like, interaction.
SIMON: Fascinating. So it's like Facebook figured out the sort of information that would keep you there. TikTok figured out how to package any information to keep you there.
KATE KLONICK: Yes, that's, like, one way of thinking about it.
SIMON: Oh, God!
KATE KLONICK: Yeah. I mean, you know, but this is not new. I mean, like, advertisers have been doing this forever.
SIMON: Sure.
KATE KLONICK: Right? Like, this is, you know, it's just a very different business model. It is a very different product model.
SIMON: And it seems to then be a very different informational ecosystem you're creating, because if you're pushing up everything that falls within certain bounds, and you're deciding what those bounds are, it becomes far more—like, is "controlled" the right word? What—what's the word?
KATE KLONICK: Yeah, it's controlled, but it's also in, like, a certain way, it's even more dangerous, because, like, the ultimate in censorship in American First Amendment law is really prior restraint.
SIMON: Wait, sorry. What is prior restraint?
KATE KLONICK: Prior restraint is censorship before something goes up or is ever published.
SIMON: Oh, so it's not redacted. It's that it was never printed.
KATE KLONICK: Exactly. That is the exact distinction. And it's important because the existence of the redaction, the proof that it was removed from Facebook, is actually evidence that censorship has happened, right?
SIMON: Right. Right, right, right, right.
KATE KLONICK: Whereas with TikTok, you never even know what you missed. You never even know what you were kept from seeing. And that is really unfortunately what we're staring down at this moment, because in the last five years, American social media has moved toward a TikTok approach to content moderation.
SIMON: Wow. Okay, I didn't expect us to be talking about TikTok so much, but I'm glad we have. So if I'm telling the story of this, it's like, once upon a time, Facebook creates content moderation for everything, all these policies, all these rules. Meanwhile, TikTok is sort of lurking across the Pacific, eventually jumps over, and Zuckerberg and the Silicon Valley folks see they're doing it this very different way. When does that actually start to shift, not just the way Facebook is thinking about its content moderation, but also maybe the way people are experiencing Facebook as a result?
KATE KLONICK: That is not as clear. But the biggest sea change is the one that you're thinking of.
[ARCHIVE CLIP, Mark Zuckerberg: Hey, everyone. I want to talk about something important today, because it's time to get back to our roots around free expression on Facebook and Instagram.]
KATE KLONICK: Which is the one that happened on January 7th of this year, 2025, when Mark Zuckerberg announced the end of the fact-checking program ...
[ARCHIVE CLIP, Mark Zuckerberg: We've reached a point where it's just too many mistakes and too much censorship.]
KATE KLONICK: ... and that he was going to try to move towards a Community Notes-based system of content moderation.
[ARCHIVE CLIP, Mark Zuckerberg: So we're gonna get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms.]
KATE KLONICK: And I mean, I think that, like, it was and it wasn't a sea change.
SIMON: Okay, well, and talk to me, like, when we say Facebook got rid of its fact-checking, at its sort of height, what was Facebook's fact-checking?
KATE KLONICK: Okay, so not much. [laughs]
SIMON: Okay.
KATE KLONICK: Which is why this was a really—which is why this was such a—such a frustrating announcement, and it was frustrating that the media focused on it so much. The fact-checking was like a commitment to fact-checking because there had been so much clamor about mis- or disinformation, but they were removing posts days after they were flagged and, like, it was very small. And so to watch it go on the chopping block was really more of a signal to a very particular person, and to a very particular party that felt like big tech censorship was coming for them. And, like, you know, we can get into a whole kind of conversation about whether or not that was reality-based, but that was kind of the complaint.
SIMON: Right. And if I'm gonna mount the best defense for conservatives about censorship by big tech, it would be that during the pandemic, there was sort of a party line as to what was an acceptable way to talk about the origins of the pandemic, right?
KATE KLONICK: Yeah.
SIMON: And ...
KATE KLONICK: You can even go before the pandemic.
SIMON: Okay. You could take it before.
KATE KLONICK: You can—there's a few things, and one of them was ...
[NEWS CLIP: There are serious questions for Joe Biden this evening following the publication of emails allegedly belonging to his son Hunter.]
KATE KLONICK: The Hunter Biden laptop scandal.
[NEWS CLIP: Reporting lays out purported emails between Hunter Biden and a Ukrainian businessman.]
KATE KLONICK: New York Post. They broke the story, and links to that were taken off Facebook and Twitter. That was absolutely censored.
SIMON: And what was the justification by Facebook?
KATE KLONICK: Well, that was happening a couple weeks before the 2020 election. And so what had been the huge concern for Facebook and all these other companies was how social media impacted the 2016 election, and so they made a lot of big changes. And one of them was just kind of like, we're not going to allow things that could possibly be foreign-influenced to stay up, because this is exactly what we got yelled at for in 2016. And so they kind of overcorrected. And I think in hindsight, it was a really hard call, and maybe probably the wrong one? And then you extend that to the—the Wuhan lab leak. Now those were just insane, insane issues. And look at us, we're still talking about them today. It's not like they were that censored. Unlike going to, say, China, where it's like, if you're like, "Oh, you know, Tank Man," and they're like, "Who?"
SIMON: Yeah.
KATE KLONICK: Right? Because there are no photos of Tank Man.
SIMON: Right.
KATE KLONICK: They are not published, right? And so it's not—I just also ... [laughs]
SIMON: Point taken. Okay, well so then, like, what has changed then? If yes, there was some censoring going on, and censoring of things in these sort of critical moments, like, would that not happen now? Is that the difference?
KATE KLONICK: I mean, I—my honest belief—I can't predict the future, but my honest belief is that the administration would very quickly put the platforms in line. Yeah, I think that there would be no hesitation to do this, because I don't think that this was ever about free speech. It was about their speech. And that is—that is really what you're unfortunately seeing right now. There are no recognizable free speech notions coming out of this current administration, and with the TikTokification of social media, people have seen the vector for power that is in content moderation.
SIMON: Okay. So Kate, you were saying that TikTok has this fundamentally different approach to content moderation, that instead of reactively taking stuff down, they are proactively flooding the zone with happy-making stuff; that Facebook and X and others have taken notice and started adopting this approach; and that all this has happened as folks have begun to see that content moderation itself is I think you said a vector for power.
KATE KLONICK: Yeah, I think that basically what you're seeing is the power over what appears in your feed or doesn't appear in your feed, or the types of new content that you're recommended, or the first commenters that you see on a video that you just watched. That type of control is an ability that we've never seen before. [laughs]
SIMON: Okay.
KATE KLONICK: I remember when I was first writing about this in, like, 2017, 2018, presenting my research, one of the things that people were so concerned with was filter bubbles.
SIMON: Yeah.
KATE KLONICK: "Well, we're gonna be in these filter bubbles fed to us by the algorithm." And as it turns out, that was one, very true that that would happen, but also, even maybe more disturbingly, we don't even need filter bubbles anymore. People are just choosing platforms based on the types of content that they expect to find there.
SIMON: And in that way, so if we've gone from filter bubbles to platform islands where the owners of the platform get to push up whatever it is that fits whatever their ideological ends are—China and TikTok, it seems to be, like, milquetoast stuff that's not gonna rile you up but it's gonna keep your eyeballs on here—it feels a little bit like X, formerly Twitter, is the mirror image, where it's like, "We're just gonna rile you up all the time." Is that right, and is that what we're just gonna see more of, which is, "Come to this platform island for emotion A; come to that platform island for emotion B"?
KATE KLONICK: I think that that's exactly right. I mean, yeah. I mean, that's what we go to the movies for. That's what we tune into, like, certain types of things for, right? It's, I'm not in the mood for, you know, a horror film, so I don't go to a horror film. This kind of approach is much easier to moderate. People get much less upset.
SIMON: Yeah. Yeah.
KATE KLONICK: And it's much cheaper, because there is not as much reactive content moderation to do. You don't have to employ hundreds of people in call centers to review every report of something that's been flagged, and so this has kind of become the new standard.
SIMON: I remember one of the big questions, probably in the first piece we did, it was this question of, like, what kind of space to consider Facebook? Because the First Amendment treats private spaces differently than public spaces, so it matters whether or not Facebook is more like a mall or a public square. And so given all these changes you just mentioned, like, what is the metaphor now? I have one based on what you've said, but I'm curious what yours is.
KATE KLONICK: No, I mean, I've always liked the "mall" metaphor, and it has a weird squirrely little place in First Amendment law in a bunch of cases, but I want to hear what your—I kind of want to hear what yours is.
SIMON: Well, to me it's now—or certainly the direction things seem headed based on what you've said is that it's now, just—it's just broadcast again.
KATE KLONICK: Yeah.
SIMON: And with broadcast there is no free speech right.
KATE KLONICK: No.
SIMON: Like, ABC, NBC, they can cancel a show at any time, they get to decide exactly what the evening lineup is. But with this, with social media it's like—it's like a broadcast camouflaged as an organically generated thing.
KATE KLONICK: A hundred percent. You know, you can shadow ban or take down or limit the reach, but it doesn't even have to be that subtle. Like, Elon Musk always showing up in my feed even though I don't follow Elon Musk is like having Rupert Murdoch in, like, the interstitial spaces before every commercial break at Fox News, you know, like, directly telling me what I should think. That isn't subtle.
SIMON: Yeah.
KATE KLONICK: Like, that is the other thing about this that is maybe the scariest part of the last couple of months is that none of it is even super pretextual. Like, there isn't a lot of, like, excuses. We're not even hiding behind algorithms anymore. It is just the owner of the—of the platform saying the thing out loud, and forcing everyone to see it if they're on his platform. You know, I think that if you're going to all of these different platform islands, the other thing is like, how do we change this? To use regulatory regimes to try to control how they speak is obviously a problematic thing by any type of measure. We don't want governments controlling speech for the exact reason of all of the authoritarianism we've just discussed. And so I think that there's—it's very hard ...
SIMON: Sorry. If I can jump in there, though, but it does—like, yeah, I'm not for and have never been for the federal government coming in and molding Facebook's content moderation policy.
KATE KLONICK: Of course not.
SIMON: But if something no longer resembles a public square at all, and instead has become—to keep reusing my label—like a camouflaged broadcasting network, where it's like, yeah, these are individuals saying something that they believe in, but then that is being collated, amassed and pushed out as an opinion-changing product by someone on high, I am okay, at that point, with there being some sort of regulation. It's not regulating maybe what people are allowed to post, but maybe how it's being aggregated? I don't know, there has to be some clever—somebody smarter than me who could come up with these sorts of rules.
KATE KLONICK: No, I mean, like, every western state has some type of media regulator specifically to avoid maybe, like, two or three people controlling all of media.
SIMON: Right.
KATE KLONICK: Right? But all of a sudden we're, like, on the internet, and yes, there is an infinite amount of content on the internet, but is it so infinite? Like, if there are—if we're talking about, like, the same three main places that people are going to for their news, people are going to for, like, their—for their daily interactions, people are going to, to feel like they're part of a conversation, their water cooler, their public square, whatever it is, if that is, like, three people and they're all friends of the president, like, that's—that's a problem. And maybe even more importantly, journalists, they go to X, they go to Bluesky, they go to YouTube, they go to TikTok, and they report things that are happening in those places as if they're real places that things are happening. But they're also controlled by these individuals, and so they're not reflective necessarily of real world, yet they're being reported on as if they were reflective of real world, right? And so I just think that what you see in the last five years is an industry understanding the power that it holds in content moderation, that it's so not a customer service issue, that it is actually like a huge, huge force for shaping public opinion, and that that has exponential value to political parties and governments. It's like, as valuable as oil and guns, because how you push things, what you keep up, what you take down, I mean, this is how you can basically create, you know, the rise and fall of presidencies if you want to, or political parties. And they know how to market them to you no matter how niche you are. And that's scalable, and so, like, it's a way to make a lot of money, and then it's a way to control a lot of minds.
SIMON: You know, I think one of the reasons you and I have gotten along so well over the years and have worked so well together in this now trilogy of stories is that we both have sort of an unorthodox approach to this. I mean, most people were saying that these Facebook guys were idiots, that they're bad, that they're causing lots of trouble, that we should just, like, cast scorn upon them.
KATE KLONICK: Yeah.
SIMON: And you, and then me sort of following your lead, was more like, "What if we actually try to understand this problem?" And I guess now with hindsight, I'm wondering, like, did we miss something here? Were we sort of played the fool?
KATE KLONICK: You know, it wouldn't be the first time that someone has told me that in some way I'm a useful idiot to Facebook or in some type of capacity.
SIMON: I didn't—I would say we would be useful idiots. I didn't call you one. I'm asking if we are, is the question.
KATE KLONICK: I feel as if a lot of people, and a lot of what we've said today, people will be like, "Of course this is what happened. This is what we were saying would happen." But it wasn't a fait accompli when we talked about it. It wasn't. Every single one of these solutions has the same flaw at the end of the day to it, which is that these are for-profit companies that do what they want to do, and things change as things settle. So I don't know.
SIMON: Okay, well so then, like, is content moderation sort of dead?
KATE KLONICK: I just—this is like a—this is like a very controversial thing. It really depends on what you mean by that question. There has been a lot of controversy around, like, are they going to invest in these huge cost centers of trust and safety? Are they going to care about this type of issue if they can TikTok-ify everything and just send you down these rabbit holes of endlessly drooly, like, eye-glaze-over, like, WALL-E kind of scene where you're on the couch with your Slurpee, like, Barcalounger or whatever, like, watching things? Is that what they're basically going to do, and are they going to have to keep moderating? And I mean, I think that, like, the answer is that we're going to increasingly see an automated content moderation system. It's going to increasingly not embody the edges of society and the range of voice that we had at the beginnings of the internet, and that we are going to kind of see a product-ification of speech.
SIMON: I'd love to give you one more idea that I've been playing around with for a couple of years.
KATE KLONICK: Yeah!
SIMON: If I was ever going to write a short sci-fi story, it would be about the quote-unquote, "perfect piece of art." You step in front of it, it does a quick facial scan of you, pulls everything about you that it knows from the internet, and then it puts forward an image perfectly generated for you that will evoke a feeling.
KATE KLONICK: Hmm.
SIMON: On Tuesdays it's happiness, on Wednesdays it's sadness, and so it's this visual tableau personalized to every person that evokes the same emotion. And once you have that, once you can control the emotions of people with the flip of a dial by putting something in front of them that's going to only pique that feeling for them, then you could just control everybody.
KATE KLONICK: Wow. I love that, it sounds like a Ted Chiang story, honestly. But that's—you know, you should write that. Maybe you can ask AI to do it for you if you're really busy.
SIMON: [laughs]
SIMON: This story was reported and produced by me, Simon Adler, with some original music and sound design by me. Mixing done by Jeremy Bloom. Of course, huge, huge thank you to Kate Klonick as always. And yeah, we will be back next week saying some more things. Until then, thanks for listening.
[LISTENERS: All right. I think we're using this one. Hello, hello. Oh, I can hear myself! Kids-podcast crossover special. Hi, I'm from—wait, I'm Nora Silton, and I'm from New York and here are the staff credits. Radiolab was created by Jad Abumrad, and is edited by Soren Wheeler. Dylan Keefe is our director of sound design. Lulu Miller and Latif Nasser are our co-hosts. Our staff includes: Simon Adler, Jeremy Bloom, W. Harry Fortuna, David Gebel. Oh, so I just have to read that one name? Okay. Who? Oh my God! Sindhu Gnanasambandan, Annie McEwen, Alex Neason and Sarah Qari. Oh, Sarah Sandbach, Anisa Vietze, Arianne Wack, Pat Walters, Molly Webster and Jessica Yung. Yeah, yeah. I see it. Do I sound, like, happy? With help from Rebecca Rand. Our fact-checkers are Diane Kelly, Emily Krieger, Anna Pujol-Mazzini and Natalie Middleton.]
[LISTENER: Hi, this is Laura calling from Cleveland, Ohio. Leadership support for Radiolab's science programming is provided by the Simons Foundation and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.]
-30-
Copyright © 2025 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of programming is the audio record.