To view the full transcript of this episode, read below:
Eugene Volokh: [00:00:00] Welcome to Free Speech Unmuted. I'm your co-host, Eugene Volokh of the Hoover Institution. My other co-host is Jane Bambauer of the University of Florida. And we have a special guest today, Congressman Ro Khanna. Congressman, tell us a little bit about yourself.
Ro Khanna: Thank you for inviting me.
I appreciate our back-and-forth on issues of free speech and have benefited from it. I represent Silicon Valley in Congress; it's my eighth year. I am on the Armed Services Committee, on the Oversight Committee, and on the Select Committee on competition with Communist China.
Eugene Volokh: And you actually are interested in technology policy. You actually know the subject; you've written a book about the subject, [00:01:00] so we're particularly delighted to have you here. So tell us a little bit: what do you see as the major things that American technology policy should focus on?
And we think about the free speech issues; this is a free speech podcast, not a technology podcast as such, but obviously they intersect quite a bit. So if there are any particular things that you have in mind with regard to free speech, whether protecting it, restricting it where you think that's necessary, or perhaps doing things affirmatively to help promote free speech, tell us what you think we and our listeners and viewers should be thinking about.
Ro Khanna: I would say maybe three broad themes. First, after the printing press, Erasmus went from praising the printing press to criticizing it, because you had pamphlets back then that would [00:02:00] make our social media look tame. It actually led to wars in Europe, and it took humanity collectively a century to build the architecture of liberal democracy, so that today we celebrate the printing press. So I think when we're talking about speech, we also need to talk about what constructive speech looks like and how to design digital institutions to foster that, beyond the argument of "let's have free speech." How do we have thoughtful speech?
How do we replicate things like the town hall, which is a quintessentially American tradition, online? How do we get people to talk to each other, listening and with respect? Maybe this podcast is one example. There needs to be much more focus on the building part. The second theme is that, by and large, I err on the side of free speech.
I think it is good to have different points of view expressed, points of view challenging the conventional wisdom. Of course, ideologically, [00:03:00] liberals don't like it when the speech offends our sensibilities, often on issues of race and gender; conservatives don't like it when it offends their sensibilities, such as on support for Israel.
So it is always easy to say "I'm for free speech" in principle, harder to do in practice. But on calling balls and strikes, I think we should err on the side of free speech, especially as we become a multiracial democracy. The one caveat to that is our youth. I haven't read, but plan to read, Jonathan Haidt's book, The Anxious Generation.
I do think social media has had a challenging impact on young people, in terms of addictive algorithms and eating disorders, and also potentially ideation of suicide. And I think we have to do a better job in regulation of social media when it comes to America's youth.
Eugene Volokh: Sounds very interesting.
Thank you very much. I have a few follow-up questions, but I'm sure Jane does too.
Jane Bambauer: So I'm debating whether to stick with social media for a second or to go [00:04:00] broad again. Let's stick with social media for a little while. I share a little bit of the same concerns and nervousness that it sounds like you do, that Jonathan Haidt does. On the other hand, even thinking back to the printing press and its history that you just elaborated, it seems that there the story was that we wound up in a good equilibrium when we found rules that were quite permissive in terms of what people can do, and found other rules, related to democracy and to limited government and whatnot, to deal with some of the challenges that free speech could otherwise bring.
So I guess I'm wondering, with social media, do you think that this is noticeably different from other media institutions that were new at one point, such that we really do need to limit access to it? [00:05:00] Or do you think that there's going to be some benefit to letting people adapt, and letting parents and schools and other decision makers that are closer to the source decide how to use these things?
Ro Khanna: That's a fair question. Of course, in some ways we limit movies and television. Certainly we have movies rated in terms of age limits. And kids get around that, but we try to do it. And we have expectations that television programming should be age-appropriate, or at least give parents warnings.
But I do think that social media has a different element, in terms of the algorithms that can get people, young people, addicted to getting a certain type of content that [00:06:00] has, anecdotally and statistically, caused harm. Now, there are advantages for young people who may feel isolated from their own families, who may be gay or marginalized, and use social media as an outlet that they wouldn't have had.
But I think that there needs to be a better job on enforcement for people under 13, to the extent we have COPPA: how you actually enforce that. And then, thinking about between 13 and 16, or we can debate the age, what regulations protect children without causing suicidal thoughts or eating disorders or things like that.
And I guess my balance on that would be more protective of children's well-being than as absolute a supporter of speech.
Jane Bambauer: What do you think of the TikTok law, [00:07:00] the divestiture law that is presumably going to take effect, although of course TikTok is challenging it?
That's a little different from what you've said so far, because I think the threat from TikTok is not just to children, but to the general public. There too, do you see the medium as being significantly different from newspapers and leaflets and other things that we've seen before?
Ro Khanna: I voted against the ban. I do see it as different from the newspapers and others, in that it has an algorithm that is potentially getting people hooked onto something and showing a particular type of content. My concern about the ban was that I just thought it was way too broad.
I wouldn't ban all of [00:08:00] Facebook and Instagram and YouTube just because I think they may be having a negative impact on some kids. I think there are other, narrowly tailored ways to regulate it. And Eugene and I actually had a conversation when I was thinking about how to vote on it, and the point was that you would need a national security argument.
I think that's what the courts will look at. They may give deference to Congress finding that there was a national security consideration, and the president finding that. But the evidence that I saw didn't rise to that level. It was all hypothetical: the Chinese Communist Party could use this, potentially, to do that.
And I don't think banning a platform used by 170 million Americans on a hypothetical is wise. I think we should have had more narrowly tailored laws, such as saying a foreign government should not be allowed to interfere in the algorithms of a social media company in the United States, and that could carry criminal or civil penalties.
But I thought the ban was overbroad.
Jane Bambauer: [00:09:00] Eugene, do you want to go back to some of your questions?
Eugene Volokh: No, these are all the relevant ones. I do think with a lot of these things the devil is in the details, both in terms of which policies specifically are the ones that will better protect speech and better protect children and such, and also, in particular, how you're going to implement them in ways that actually end up working.
So here's one thing that I wanted to ask about, because this is an issue that comes up pretty commonly. On one hand, children are not the same as adults, and the law in various ways recognizes that. For example, sexually themed material generally can be restricted when distributed to children in ways that it couldn't be when distributed to adults.
There's material that's viewed as obscene as to minors, that is lawful for adults to buy but not lawful for people to sell to 12- or 13-year-olds, let's say. There have been [00:10:00] attempts to try to do the same with regard to other material, like violent video games. The court has rejected that. So exactly where the First Amendment line is drawn is hard to know, but there is some distinction there.
Here's part of the problem, though: restrictions that are aimed at shielding minors will, in many ways, end up also affecting adults. For example, with regard to pornography: even before the Internet, if you card people in order to make sure that they're old enough to buy pornography, that means people can't really buy it anonymously.
And likewise with the Internet (there's currently litigation on this very issue): to the extent that there are attempts to shield minors from that and from other things, there's going to be spillover, at least onto the privacy rights of adults. So I'm wondering what you think about that: [00:11:00] how much of a sacrifice to the rights and interests of adults is acceptable? Both the rights of adults to speak, because after all, even when adults are speaking where many children are in the audience, there are presumably also other adults in the audience; and the rights of adults to themselves consume information without undue difficulty and without intrusions on their privacy.
Ro Khanna: No, it's a fair point. I think when we look at, let's say, the specifics of social media, where you have to be 13, and let's say we move to more strict liability for these companies if a user is under 13, just as a hypothetical, then there would be costs. The reason the companies resist that is that it would be harder to grow their following.
You'd have to jump through a number of hoops to prove who you were. We could ask whether that is too much of a cost burden on these businesses; that's fair. I guess I'm not that [00:12:00] concerned about the privacy burden, because I think these companies already have such troves of information on these people.
I think I would rather protect it and say: you need to get people's consent; you need to make sure that you can't collect more information than is necessary for your function. But to protect children, I think it is fine and reasonable to say you need to prove, in some way, your age and your identity.
And it also has the added value of having actual people on these sites and not AI-generated bots, which may clean up some of the toxicity of these sites, at least by knowing that they're real people. But I get the trade-off. It could be that some of the regulations that would apply to a junior Facebook or Instagram site, say on likes and shares, if they forced an adjustment, [00:13:00] could impact the broader design of the platform.
Or they could impact young people who wanted to communicate political views with adults. But I guess I just look at the data, the mounds of data on particularly teenage girls and what social media has done to their mental health, and say that's a pretty compelling interest to have more regulation.
Jane Bambauer: I know in your recent book, the progressive capitalism book, you lay out what you think are the most important principles going forward in a digital economy. I think our listeners would be very interested in those: the goals, the rules of the road for the information economy that we're in now.
And maybe after you lay them out, we can talk about whether the First Amendment tends to get in the way of that vision or whether it steers it in the right direction. Do you want to go ahead and [00:14:00] describe what the most important principles are?
Ro Khanna: The crux of the book, and the better half of it, is all about economic opportunity, and the argument that the modern economy has concentrated economic wealth in places like my district, $10 trillion of market value in Google, Apple, Nvidia, Tesla, and in New York and Seattle, while so many other parts of the country have been de-industrialized, have been left out of modern economic prosperity. How do we expand the modern digital economy and those opportunities to places and people left out? And how do we make sure that Americans aren't just consumers or content creators, but are actually participating in the architecture of digital wealth generation, which has been the biggest source of wealth generation, at least in the stock market and market capitalization of companies, over the last 20 or 30 years?
And as I was [00:15:00] talking about that, people would say, okay, but you want to expand the digital footprint; what about some of the harms of the digital economy that you want to expand? And there I said, this is why we need to make sure that we have privacy, that we have some Internet bill of rights of protections: so that people have a say, ultimately, over their own data and how that data is used; so that there's some standard of data minimization, of not collecting data beyond the business purpose; and perhaps a fiduciary duty with the data that's collected.
People should have the right to have their data deleted from a company if they want. I don't think they should have the right in the sense of the right to be forgotten in Europe, where if there's a bad article about me I can petition Google to take it down; I shouldn't have that right.
[00:16:00] But certainly I should have the right to get my data deleted from Google's own servers. And then in terms of free speech, what I say is: one, we want a multiplicity of these platforms; the more platforms we have, the better, even if it's said that makes it harder to have constructive conversation.
I think a multiplicity of discourse spaces is better for a society. So we want to have as many of these types of platforms as we can, we want to minimize their data collection, and we want to have as much transparency as possible. Knowing, without the proprietary details, what their algorithms are and how they make certain decisions, at least for big platforms; I think having that transparency is important.
Eugene Volokh: I appreciate the arguments for having more competition in this area, especially [00:17:00] as these platforms, whether they're, say, social media platforms or AI companies, end up having very substantial potential influence on politics, right? Especially if the Supreme Court says social media platforms are indeed free to ban users based on the viewpoints that they express, that will make them even more powerful political actors.
And certainly to the extent that people start relying more and more on AI output from search engines, so basically OpenAI as delivered by Microsoft Bing, Google Gemini as delivered by Google: people start asking, who should I vote for in the next election, or what's the right way of thinking through some issue, and they go with the AI output.
That's going to be tremendously influential on politics. And if you've got two companies having that influence, that's going to be quite dangerous. But as a practical matter, how does [00:18:00] one actually accomplish that? There's a reason why, for example, Google Search has this immense influence, immense market share.
There's a reason why OpenAI and Google Gemini put together have very large market share. There's a reason why, in their own particular spaces, things like Facebook and Twitter (I'm not even going to call it X; it's Twitter to me) and TikTok and Instagram are so powerful. There aren't going to be a lot of mom-and-pop companies out there, right?
What can be done, as a practical matter, to maintain the kind of competition that we'd like to have in the abstract, but that maybe economics makes difficult?
Ro Khanna: I would say two things. And I agree with you that there's a huge advantage to having a large network. I don't know if you're on social media, but I wouldn't be on a platform if it [00:19:00] had a small following.
The value to me of being on it is that it has a large following, and I only have so much bandwidth, and my team only has so much bandwidth. We can barely keep track of four or five platforms, let alone 50. But there are things we can still do to promote competition.
One is, I don't think we should have allowed the merger of Facebook with WhatsApp and Instagram. Now, people say that at the time we didn't know Instagram was going to become Instagram. I think by the time Facebook acquired it, it was pretty obvious it was succeeding.
It's actually one of the reasons I'm against the TikTok ban: I think having a Facebook or a YouTube acquire them would actually mean less competition and fewer platforms. So we certainly can restrict big mergers and acquisitions of [00:20:00] sites. We can also, in my view, make sure that we have some public Internet space. By that I don't mean it has to be a national public Internet, though that could be possible; you could imagine a community in my district, for example, saying we want to have a site where neighbors and others can talk to each other, under certain reasonable time and place restrictions, about local sports and things affecting the local community. Nextdoor is somewhat like that, but imagine more robust public support for that kind of architecture, a public architecture.
And that would give at least a public outlet, to provide an alternative to a Truth Social and [00:21:00] all of the other social media platforms. And then you want to make sure that these companies are being competitive, and not doing things that are vastly anticompetitive in order to prevent new competition.
Eugene Volokh: Got it. Can I ask you just a quick follow-up? This is not the most important feature of the proposal, but it's an interesting illustration, I think, of the free speech complexities when it comes to public, government-run platforms, or sub-platforms, or discussion spaces. There would indeed have to be viewpoint neutrality rules, right?
Ro Khanna: Yes, so it'd be like a town hall, right? Yeah.
Jane Bambauer: For better and for worse.
Ro Khanna: Pardon?
Eugene Volokh: So if indeed the defenders, NetChoice and other defenders of the rights of the platforms to moderate things, are correct that it's really important to have viewpoint-based restrictions on supposed hate speech, or sometimes just personal cruelties and the like, [00:22:00] in order for the spaces to be useful, those are the kinds of things that you wouldn't be able to have, I think, on a government platform.
Ro Khanna: You would know the details of what time and place restrictions are reasonable, right? At a town hall, if someone was hurling racial obscenities at someone else, as a practical matter, in most places they would be escorted out by police. And I don't know what the ability to do that would be, or not, in one of these spaces; I think that's a really good point about viewpoint discrimination in the town hall, or in one of these spaces. You could probably say that you want it to respect, not viewpoint discrimination, but the ability of everyone to speak and not be disrupted.
Eugene Volokh: So I do think that restrictions on vulgarities and personal insults, excuse me, let's just say vulgarities generally, would probably be content-based but [00:23:00] viewpoint-neutral, and might be permissible in a limited public forum. If you're targeting specifically bigoted vulgarities and not others, that's viewpoint-based.
But again, you might say, we don't want people to say vulgar things to each other in any event, to use epithets toward each other, regardless of viewpoint. So I think that would be permissible. But there are a lot of other things that I would think would be content-based, that a city that's running this, or a university that's running this, or a public library, wouldn't be able to do.
But, again...
Jane Bambauer: Especially because the analogy might not be to a town hall but rather to a town square, where I think the issues of not cutting people off, not having a set order of speaking, and also not having a sort of captive audience come in. Some of that might go away depending on how it's structured, but it's an interesting question.
Eugene Volokh: Yes, absolutely. I've hijacked this with my little thing.
Jane Bambauer: [00:24:00] So part of me wants to see governments, local governments especially, contributing these types of things, if for nothing else so we can have this experiment and see what happens to public debate.
But, Congressman, there's a tension between some of the things you said, at least as I perceive it, and I'd love to know if you see it too. On one hand, one of the pillars of fairness in the new digital economy that you've raised is privacy, and I think there might be some First Amendment issues even with the right to deletion, for example, but I want to put those aside for a minute. Because I also think that if we're most concerned about competition, which I think the three of us maybe have in common, then privacy rules that make data hard to collect in the first place, hard to reuse, and sticky to move over,
those tend to get in the way of new entrants. And [00:25:00] then I'm also just not sure, if we are very worried about social media in particular being addictive, that we actually want competition. Because if we have lots of platforms that are trying to compete for engagement, then they're going to compete for engagement.
And I guess one problem is that I have not been able to draw a nice clean line between addiction and the sort of content provision that people seem, through their behaviors, to actually want to engage in. So I'm wondering if you see a little bit of tension or incoherence between wanting, on one hand, lots of these platforms to thrive, and on the other hand worrying about people spending too much time on them.
Ro Khanna: Yeah, both points certainly point to tensions where [00:26:00] we have to make choices and recognize the trade-offs. I think privacy is such an important value, in my view, for most Americans, at least basic privacy. Now, having 50 different state laws, I think, makes it harder, so I'd want a federal standard. And here the devil is in the details. Obviously, if you make the regulation so cumbersome that only Facebook and Google can comply with it, with their big legal teams, that's a huge advantage to the incumbents. But you can have basic laws: that you need to get consent before collecting data, that you have to minimize data use.
And you want to have the portability of data, I think. But the portability of data, which would allow for competition, may make privacy harder, because then you're moving data from one platform to another. I think there's a balance, but I don't think we can just say we want to lower the [00:27:00] privacy thresholds because that's going to allow more startups.
I think you're going to have to have a threshold that any company is able to meet. And if we have these norms in place, maybe that'll become easier to implement. But we need to be careful; companies have complained about some of the European regulations being so specific, so prescriptive, that it could make it harder for new companies to come in. In terms of the competition on addiction, I think they're already competing. They're competing with television, and with books, and with social gatherings, and with sports. These engineers are already trying to maximize attention. So it's not clear to me that, just because there's TikTok, Facebook is competing more on algorithmic addiction than it otherwise would. I guess that's an open question, whether competition is forcing more addictive [00:28:00] design. But I guess, even on the margins, I think they're already trying to maximize time, and I'd rather we have more spaces with different rules of content moderation. For example, Facebook and Instagram don't allow political posts, and if you post on the Middle East, your post is probably not going to go as viral as if you post on TikTok.
Now, some people are arguing that TikTok is slanting it one way or the other, but more likely it's just that they're allowing more explicit political speech. And so I think that giving people multiple outlets to express themselves, and the choice, is a good thing. It's why, as much as I dislike Truth Social, I would never say ban that platform.
Eugene Volokh: So let me ask about people's choice. One thing that makes me a bit uneasy about this talk of addiction, especially for adults, but also even for minors, [00:29:00] especially older minors: we're facing this kind of perplexing, or at least let's just say unusual, situation. Here you have entities that are giving people more of the speech that they are interested in, more of the speech that they enjoy, more of the speech that they view as significant to their lives. And now we're faulting them, like they're doing too good a job, and we say they're using algorithms to hook people on things. Yes, they're using algorithms to figure out what it is that people really want, instead of giving them things that are boring, instead of giving them things that are irrelevant.
They're giving people the speech products that those people find useful and interesting, even if we wish maybe they didn't find them quite so interesting or engrossing. If we were to fault, let's say, a book author: oh, [00:30:00] this person actually did a lot of research into how to make a book that's so engrossing that people will want to read the sequel, in fact so engrossing that people might stay up until two in the morning reading it, and lose out on their sleep, and maybe won't be as good at studying the next day, because this person has figured out how to write a book that everyone wants to stay up in the middle of the night reading. I think we'd say, actually, that's okay; that's just an author that's especially successful, one that provides people with things that they actually enjoy.
How is this really that different?
Ro Khanna: Sure, but in 10th grade we wouldn't want to assign just that author. We'd force people to read Hamlet, right? And I guess the question is, at some point we have a view of a free society where the clash of ideas will enable people to find their best selves.
But I don't think all writing or all content [00:31:00] is equal. We just have a view that the government shouldn't be the one deciding what the best examined life is. Unlike the ancient philosophers, who thought that it was for the government to say what the good life is, we think people should be able to find that for themselves in conversation.
And we have great trust in our adults to do that, in the genius of ordinary Americans, at least collectively. But we don't have that same confidence in our youth. I think we can be much more prescriptive when we say that you don't want your 9th grade teacher just talking about the Giants or the 49ers, even if that's what the kids would want to talk about. And to some extent, this is giving kids, the young folks, the content that, yes, they may most want, but not necessarily what they most need. That is a prescriptive value, but our entire school curriculum is prescriptive.
Eugene Volokh: But this isn't just about the school's [00:32:00] curriculum, right? The attempts to restrict social media are about things that kids will end up watching and reading outside of, in addition to, the school curriculum, perhaps, to be sure, in competition with it. But usually the school's power to restrict the curriculum doesn't translate into the government's power to restrict things that are consumed outside of school.
Jane Bambauer: although interestingly, I do think that a lot of parents are, supportive of this kind of regulation because they don't like having to say no.
Eugene Volokh: Maybe that's right.
Jane Bambauer: Especially if they're the first parent to say no.
Eugene Volokh: That's true. I do wonder if that's true.
Jane Bambauer: There could be some collective action problems here that might explain why these bills are so popular. Although, I have seen some parents band together and agree to all say no at the same time.
Eugene Volokh: Speaking of antitrust violations, that's a cartel if I've ever heard of one.
Ro Khanna: Do you think, putting aside [00:33:00] the First Amendment concerns, how concerned are you that social media is contributing to mental health or education issues, or worse, in the formation of a young person? Or do you think it's being exaggerated, and that there are other issues that explain that?
Eugene Volokh: That's a very interesting and important question, and I totally appreciate it.
It's an empirical question. Setting aside the First Amendment, maybe even in light of the First Amendment, it should still be an empirical question. But certainly setting it aside, it is an empirical question. I'm certainly open to the possibility that maybe it really is bad for kids to watch too much social media, or watch too much television, or maybe even to read too many comic books.
And remember, that used to be the objection; it seems quaint to worry about that now. It's certainly possible. I entirely appreciate the point that our view that more speech is better, that's a [00:34:00] good first cut as an empirical matter, but
Jane Bambauer: empirically, we have to be
Eugene Volokh: open to the possibility that it's not.
What do you think, Congressman? What's your sense of what the real practical problems here are, what the practical harms end up being? Because I expect you've studied the empirical question more closely than I have.
Ro Khanna: I don't have more empirical data. I guess I might as a politician.
I do have a lot of anecdote of anxious parents who come up to me all the time and say, I, have, particularly with teenage girls. I have them or young kids are going out during dinner and posting on Instagram and, they're obsessed with it and spending hours on it. and, it's, hard.
There's this, a great line. I forget who was, who talks about how always the previous generation is judgmental of the next generation, and they think that everything is, is terrible [00:35:00] about the next generation. To your point, there was a time people were really upset with TV and comic books. And so it's hard for me to distinguish, how much of this is just, that there is a new medium.
There was a young person I met who said she spends one hour every day, disciplined, going through TikTok, but not more. And I certainly spent more than one hour watching television growing up. She said that was the way she relaxed. But I guess I've heard too many stories, from too many parents, of real anxiety for me not to think that there is an issue for government to be involved in.
Jane Bambauer: So I'm pretty skeptical of the existing evidence base, though I actually really admire Jonathan Haidt and Jean Twenge for doing all that they've done. They have this open-source list where people can not only look at the original studies but even annotate the weaknesses and [00:36:00] strengths of each one.
But still, overall, the effect sizes are pretty noisy, and the studies are flawed in various ways. As a whole, they tend to suggest that there are some negative effects, but that they might be small. And I guess the main reason I'm skeptical has nothing to do with that, but with the fact that
Being a teenage girl is always fraught with danger, and I think parents will always look for something that explains why it's so bad; they'll look to something that is new and could explain what's actually a longstanding phenomenon. And yeah, so I personally think,
even though I myself am scared to watch my own daughter go through these ages, and hate to think of the pressure she's going to be under to at least appear happy and beautiful, or whatever it is she wants to look like, I [00:37:00] still think that overall the evidence is not quite at the point where it makes sense to do something so harsh.
Ro Khanna: But when you see the studies, and this is an open question, the New York Times every couple of months has this very interesting statistic that one third of teenage girls have had the ideation of suicide. That doesn't mean they've had a medical condition, or seen a doctor, or called a hotline, just that they've had that thought.
And then it's one or two percent who actually needed intervention. Do you think that is more than 20 years ago, 30 years ago? Do you think it's a crisis, or do you think this is just something where young girls have always had that challenge in American society?
Jane Bambauer: Yeah, so adding on to the question of what social media does is this question of whether suicidal ideation, and even completed suicide, has changed over time, because there was a change in how this information is observed and reported.
And [00:38:00] anytime there's a change like that, it's hard to know the ground truth. And there's another podcast that goes over the evidence base of certain claims, called The Studies Show, that did one on this. That one, Eugene, is really good.
Eugene Volokh: It's a lot of fun to listen to. I don't know how legit they are, but they sound legit and it's a lot of fun to listen to.
Jane Bambauer: I think they're pretty careful, but they are also pretty skeptical. So according to them, basically nothing ever happens, nothing ever changes. But they do explain it, and I was actually surprised at how uncertain we should be about whether suicide rates and ideation are going up at all.
And then, if they aren't, then we don't have much to explain. I am not a specialist at all in this area, though, so I think we're all fumbling around a little bit at this point.
Eugene Volokh: Congressman, [00:39:00] we've already taken up a lot of your time, which is a very precious commodity, and we very much appreciate it.
Let us just give you the closing word, except for our little procedural thing at the end. Tell us what you think we should be thinking about, and what our audience should be thinking about, going forward about all of these things.
Ro Khanna: First of all, I appreciate your having me on and indulging an amateur in free speech doctrine, where both of you have obviously dedicated your lives to understanding the jurisprudence.
And one of the things, Eugene, that you had said, which made me think, is that there are a lot of things in our country that could be better, but our free speech jurisprudence, with a lot of jurists having thought about it, is probably one of the more exceptional parts of America. And so I think that we should have humility in offering new ideas about it.
Not that there shouldn't be new ideas about it, [00:40:00] but we come with a pretty robust framework when it comes to free speech, arguably the best in the world. I guess I would say that at a time of increased polarization, at a time of increased anger, and at a time of increased insult, and in what will certainly be one of the ugliest presidential elections in modern lifetimes,
we have to find more forums like this. This will not go viral on social media when we post it, if we do, and other, much more provocative things would. And that may not be a function of just algorithms and blaming companies; it may be a challenge for our collective imagination: how do we lift up a more reasonable dialogue, one that combines difference with respect and some sense of understanding of where other people are coming from?
And that to me is the biggest challenge as I think about it, in the next [00:41:00] decade, for our nation.
Eugene Volokh: Makes perfect sense. Words to keep in mind. Congressman, thank you so much for sharing your time with us. And thanks to everyone for tuning in. Do people still say tuning in? It's a strange little technological leftover, tuning in to our podcast. Looking forward to the next one of Free Speech Unmuted.