Big Tech Censorship: Is Treating Platforms as Common Carriers a Solution? – Eugene Volokh Interview

Eugene Volokh is the founder and a coauthor of The Volokh Conspiracy, a leading legal blog. His law review articles have been cited in opinions in eight Supreme Court cases and in several hundred court opinions in total, as well as in several thousand scholarly articles. Eugene sits down with Alex and Nick to discuss his recent article in the Journal of Free Speech Law, which weighs the pros and cons of treating social media platforms in the U.S. as common carriers. Companies designated as common carriers are generally required to provide service to the public without discrimination.

Originally Aired: 08/12/21
#FreeSpeech #Interview #Podcast

Subscribe to the Applico YouTube Channel

Full Transcript:

(This transcript was AI-generated; please excuse any typos.)

Alex Moazed (00:08):
Hello, and welcome to Winner Take All, where we talk about the constant battle to fight back and win against big tech monopolies. Really excited to have a special guest today: Professor Eugene Volokh. Professor, thank you so much for joining us. (Volokh: Thank you very much for having me.) We also have Nick Johnson, coauthor with me of the book Modern Monopolies. Nick, great to have you with us as well. Professor, you clerked for Supreme Court Justice Sandra Day O'Connor and for Judge Alex Kozinski on the U.S. Court of Appeals for the Ninth Circuit. You've been a professor at UCLA for many, many years, and you know a thing or two or three about what it takes to live by and protect the First Amendment. You teach First Amendment law at the UCLA School of Law and recently published a paper titled "Social Media Platforms as Common Carriers?" in July of this year. So Professor, what did I miss about yourself? And what prompted you to publish this paper, on a very interesting topic, one that is very near and dear to our heart here on Winner Take All?

Eugene Volokh (01:27):
What prompted me is that everyone's talking about these things, and rightly so; it's a very difficult question. It's a question partly of First Amendment law, but also, more broadly, of free speech. People correctly point out that if Facebook or Twitter kicks you off the platform, they're not violating your First Amendment rights. The first word of the First Amendment is "Congress": "Congress shall make no law." That's been applied, through the Fourteenth Amendment (which mentions states), to state and local governments; private entities aren't bound by the First Amendment. If a private university expels you, that doesn't violate the First Amendment. If a private shopping mall kicks you out, it doesn't violate the First Amendment. But these kinds of decisions by private entities might interfere with free speech as a social phenomenon: with people being able to participate in democratic self-government through their speech, and being able to contribute to the marketplace of ideas. Of course, they contribute both well and badly, because free speech protects

Eugene Volokh (02:26):
all sorts of things that are bad as well as good. Like so many things, so many technologies, so many human behaviors, it can be used for good as well as for ill. So one question is: to what extent should we be worried about social media platforms restricting speech on those platforms? Another, related question is: to what extent do they have their own First Amendment rights to restrict these things? Think of a spectrum of platforms. On one end is, let's say, the newspaper, which is a platform not just for its own writers' speech, but also for op-ed writers, for letter-to-the-editor writers, and for columnists. Now, they not only do decide what goes into the newspaper, and decide to exclude certain things; they have a First Amendment right to do that. The Supreme Court expressly held that, unanimously, in a leading case called Miami Herald v. Tornillo in 1974.

Eugene Volokh (03:25):
And what's more, it's very valuable to us as readers that they exercise this right. A newspaper is all about what it excludes as well as what it includes. If I open up a newspaper, I want to see things that they've vetted for quality. And if I open up an opinion magazine, I may also want to see things that are vetted for ideological consistency with its message, because presumably that's what I want from that magazine. So to the extent they restrict speech that goes on in their pages, that's a First Amendment-protected right. Restricting speech that way is itself a valuable contribution to the marketplace of ideas, because it allows us to have The New Republic on the left and National Review on the right and Reason magazine from a libertarian perspective. That's really very important.

Eugene Volokh (04:12):
Let's look at the opposite end of the spectrum: a phone company. Suppose a phone company says, "Well, we don't like what you're saying, because you're a communist, or because you're a racist, or because you're antifa, or whatever else, or because, for that matter, you criticize the phone company and we don't like that, so we'll cancel your phone line." They're not allowed to do that, because they're so-called common carriers: they're required to carry everybody, subject to various neutral rules, like you've got to pay your bills. And that's true, by the way, not just for privacy reasons. A phone company could find out that, say, the KKK or the Communist Party are using their phone lines just through public information; there's a webpage that says, "To hear our message, call this number;

Eugene Volokh (05:04):
if you want to join up, call this number." They can't just say, "No, we refuse to allow our property to be used for these evil ideas," because they're a common carrier. Another example of that is UPS and FedEx. They also are common carriers; they can't say, "We refuse to deliver things from anarchist bookstores," or something like that. Now, I'm a big believer in private property rights. I think private property rights are very important, and I think there's a plausible argument that these companies should be allowed to decide what to carry on their property. This having been said, we've restrained private property rights in some measure, and by the way, not just because they're monopolies. Traditional landline phone companies used to be monopolies; these days, they're usually competing with cable companies. But the same rules apply to famously competitive entities like FedEx and UPS.

Eugene Volokh (05:55):
And the same for cell services, which compete with each other; they too are common carriers. So the sense is that these large, powerful companies, which at least put together often have a massive share of the market, should not be able to use their economic power as a means of excluding competitors in some respects, but also as a means of controlling the political marketplace of ideas. And one interesting question is to what extent we should take a similar view with regard to social media platforms, or maybe just with regard to some functions of a social media platform. Maybe we should have a different view as to when a platform decides whom to host and whom not to host, versus when it decides what to recommend and what not to recommend. Maybe platforms do have First Amendment rights when they say, "We suggest you look at this," or "Lots of people are looking at this," or "People who have your views might like this." Maybe they do have First Amendment rights to pick and choose as to those recommendations, but don't have First Amendment rights to just kick somebody off the service altogether. So this is what I've been trying to explore in my work.

Alex Moazed (07:03):
Yeah. Just a couple things to break that down a little bit further. I think what you're alluding to is that we've seen a spectrum of censorship across content platforms and social media platforms like Facebook, including now communication platforms like WhatsApp. You're seeing it on Amazon; you're seeing it on Twitch. You actually give a number of really good examples in your paper, but there's a spectrum of censorship, right? At one end of the spectrum, it's kicking users and creators wholesale off the platform. At the other end of the spectrum, there's shadow banning, there's muting how much visibility a post gets, or not recommending it, as you mentioned, to as many people, et cetera. So we've seen a spectrum of censorship, and I think some of your paper is discussing where on that spectrum intervention is appropriate. Some of what I took from it is that there should be a minimum threshold of common carrier application at the most basic level of the spectrum, that being the hosting bit. Are you still exploring and theorizing around that? Or do you think that these common carrier provisions and precedents should apply at the most basic level, i.e., hosting?

Eugene Volokh (08:31):
I'm not sure, and I'm not sure in part because I'm a big believer in private property. I'm a big believer in free markets. Generally speaking, I think free markets are better than government regulation, although of course there's the question of how much government regulation is needed in order to make sure the market really operates in a free way. But I find that lots of problems are made worse by attempts to regulate, rather than made better. So it's certainly possible that if you set up these kinds of common carrier regulations, that's going to interfere, perhaps, with new entrants: new possible competitors might find it much harder to deal with all those regulations. Likewise, let's say you have a rule that says anybody who gets kicked off for ideological reasons, or has a post blocked for ideological reasons,

Eugene Volokh (09:25):
can sue. Well, one problem with that is that sometimes there will be posts that are removed for non-ideological reasons. Just for example, platforms have an obligation to remove material that they have good reason to think infringes copyright. So they do that, and then somebody says, "No, no, no, the real reason you removed it is just because of my politics. You didn't remove this other post that was supposedly similar, similarly infringing perhaps, but from a different political perspective. Aha, I'm going to sue you." So you can have all of these lawsuits, sometimes brought by people who are pro se, representing themselves without a lawyer. They're even less likely to win that way, but at least the barrier of having to hire a lawyer wouldn't apply. There will be all these lawsuits that these platforms are going to have to deal with.

Eugene Volokh (10:15):
And it may be expensive for them; it may be expensive for the legal system. It may push them to not get into certain kinds of hosting of user-generated content, or to set up other rules that might be counterproductive. So I think we should always be cautious before any big regulatory move. I think the argument for common carrier treatment as to the hosting decision is a plausible argument; I think there are very good reasons for it, as well as reasons against it. I also think that, as a constitutional matter, if Congress wants to experiment with that, and maybe even if states want to experiment with it (that's, by the way, a separate question: whether it can be done at the state level rather than having to be done at the federal level), I don't think there's a First Amendment problem with it, assuming they do it right. There are some ways of doing it that may be wrong. But I have to say I'm quite tentative on the subject, because there are always unexpected possible consequences of any form of regulation, and on balance it may be that it will do more harm than good. I have to admit that that's definitely a possibility.

Alex Moazed (11:26):
I'm going to err on the side of probably not more harm than good, given how powerful and how overbearing these companies are, and how many transgressions we've seen, even if you take it out of the political lens (and obviously everything has become so political these days). One of the examples we've touched on with some guests on the show is actually the crypto community. So get away from COVID, get away from politics, and look at the amount of censorship that's going on there. The crypto community has been censored by Google and Facebook and the like for years. We've had guests on the show who had hundreds of thousands of subscribers, who have created hundreds of thousands of videos, and they've had their entire channel and their entire following wiped out. Not even to mention their following: all of their content, which they had uploaded to YouTube, for example, just gone, because the powers that be at the tech monopolies and these content platforms decreed that crypto-type content was infringing on one of the various versions of their policies or user standards. And that's when we take it out of the political landscape, or if you talk about religious freedoms, right?

Alex Moazed (12:51):
I mean, there are a whole series of transgressions by these big tech content monopolies outside of just the mainstream political discussion that has consumed every one of us the past couple of years here. I'm very firmly on the side that these tech monopolies have infringed; they've opened up Pandora's box. They've actually gone against the very grain of what it means to be a platform business, which is to facilitate the exchange of ideas and bring together consumers and producers. And I'd actually say they're probably in violation of their platform status as provided to them under Section 230, but that's a whole other discussion.

Eugene Volokh (13:36):
Well, I'd be happy to talk about that, because actually I think that, generally speaking, they are protected by Section 230 as to many, many facets of their operation. One interesting question is whether 230 should be modified in certain ways.

Alex Moazed (13:51):
Well, the reason why they're protected by Section 230 is that there are, like, three words in Section 230 that say "and any other harmful content," right? I mean, Section 230 was put into place to protect against child pornography and illicit pornographic material, particularly involving children. That was the whole impetus of 230, over 20 years ago now. But the language was written so loosely that big tech has used it to cover everything under the sun, including crypto content on YouTube. If you want to say "any other harmful content," then talking about the crypto community and an alternative financial system is considered harmful to the stability of the United States? Okay. But as to what was the true impetus for Section 230, they've absolutely violated the reason why these protections were put into place, which was still to let them operate as a platform. So in that regard, I'd say they're totally over the line and are publishers these days.

Eugene Volokh (14:50):
No, actually, I don't think that's quite right, though I agree with you as to one thing. So first of all, let's step back: Section 230 has two particularly important privileges. One is that it says that platforms (but also other companies that don't really deal with social media, and even individual users) aren't going to be responsible as publishers or speakers for material provided by someone else. That's what allows an entity like YouTube (a branch of Google, but let's talk about it separately), or Facebook, or Twitter, to function. Because otherwise, any time somebody said something defamatory on Twitter, Twitter could be sued, just like a newspaper can be sued for publishing defamatory letters to the editor, and Twitter would never have been able to get off the ground that way. So that's an important protection.

Eugene Volokh (15:37):
That's subsection (c)(1). Section 230 also says that platforms are going to be immune from liability for taking stuff down. So 230(c)(1) says they're immune from liability for keeping stuff up; 230(c)(2) says they're immune from liability for taking stuff down, if the material is lewd or excessively violent or harassing or "otherwise objectionable." And actually, just this morning, I was finishing editing a co-written piece of mine for the same symposium, where we say that Section 230(c)(2) doesn't give platforms the right to block everything they find objectionable just because they don't like it. It only gives them the right to block things that are objectionable in ways similar to the other adjectives listed in the statute. So we do think that 230(c)(2) provides limited protection: they can block pornography,

Eugene Volokh (16:35):
they maybe can block vulgarities, but they can't block material they find politically objectionable. So on that point, we agree. But even if they don't have this 230(c)(2) immunity for blocking, let's say, crypto material (I don't know what exactly they're doing with crypto; at least some of the stories turn out to have a different explanation, maybe a claim of copyright infringement or whatever), let's say they are blocking material having to do with crypto because they don't like it. 230(c)(2) doesn't protect them. But the problem is, under current law, there's no law that prohibits them from doing so. As a result, it's not like they're violating 230 by blocking it. 230 doesn't say you may not block anything unless it's sexually themed or excessively violent or whatever else. It just says: we'll provide you with immunity from contrary state law if you remove material. And there actually aren't a lot of such state laws; in fact, very few state laws can even be argued to currently impose obligations on platforms to host. So the question really is what new laws, if any, should be enacted in order to impose such obligations. But right now, I don't think the platforms are violating any laws. And again, we can talk about some possible theories, but they're pretty much stretches.

Alex Moazed (17:52):
So if they violate (c)(2), right, by taking down content in a way that doesn't fit within the statute's primary language, that doesn't invalidate their protections as a platform under (c)(1)?

Eugene Volokh (18:07):
Almost, except that they can't violate (c)(2). (c)(2) doesn't tell them "you may not take stuff down"; (c)(2) tells them "we will give you immunity from lawsuits over a supposedly improper takedown." And again, it's not that under current law there are going to be many such lawsuits. But it says: we'll give you immunity if you're taking it down because you consider it to be harassing, or lewd, or excessively violent, and such. If you take it down for other reasons, you're not violating (c)(2); you just don't get that (c)(2) protection. And if you do take it down for some other reason, you don't lose your (c)(1) protection; the (c)(1) protection is independent of the (c)(2) protection. (c)(1) says you're not liable for things you keep up, unless you're the one who actually created them. (c)(2) says you're not liable when you take things down, if you take them down for certain reasons. If you take them down for some other reasons, then, well, maybe if there's some law that bars you from taking them down, you could be sued for that. But it doesn't undermine the (c)(1) protection that you get.

Nick Johnson (19:20):
There's a distinction, which you alluded to earlier, between the hosting function, the recommendation function, and the conversation function, and maybe Section 230 could be adjusted to apply to certain of those but not others. Tell us a little bit more about those distinctions and how you see them.

Eugene Volokh (19:38):
Sure. So let's say that a platform blocks me from being able to post things on the platform, and thus from being able to reach the people who subscribed to my feed. Again, I don't think it has a First Amendment right to do that. I'd be happy to talk about the precedents if you want to, but I just don't think it has a First Amendment right to do that. And I don't think there's a lot of real value created by their blocking people from posting things. I mean, I suppose if you think people are posting bad things, then you think it's valuable for entities to be able to stop that speech; there's a value in that. But I think on balance, our free speech principles are that we don't want either the government, I think, or super-powerful entities that are close to monopolies in their own niches, to be able to control the debate. So other than in a moral sense, where they want to completely disassociate themselves from the speech, there's not much value there. And again, the analogy is the phone companies: generally speaking, we don't let phone companies block people's phone lines, even when they believe the phone lines are being used to spread bad ideas. So that's the hosting function. Now let's look at the recommendation function. Let's say I go on YouTube or Twitter and I'm interested in finding new things.

Eugene Volokh (21:02):
There is real value in them recommending things that they think are good, and maybe not recommending things that they have good reason to think are lies or just junk, or even things with viewpoints that most people find highly unappealing and that at least many people find outright offensive. There's real value to users in being able to follow those recommendations, and in having there be a curated set of recommendations. Because after all, if what they recommended was just everything, at random, that wouldn't be very useful to you, would it? It wouldn't be a good recommendation. Again, that's similar to what newspapers do, and what magazines do, in choosing what to cover and what to present. So that sort of thing is, I think, subject both to First Amendment protection (I think those recommendations are the company's own speech)

Eugene Volokh (21:55):
and also, I think, a source of real value that we don't want to stifle by imposing some sort of common carrier mandate. Now, then there's the conversation function. The classic example is comments that can be put onto somebody's Twitter feed or somebody's Facebook page by people who could either be completely unrelated to that page, or, even if technically they have to send in a friend request or some such, it may just be that there are going to be lots and lots of requests, so a lot of them are going to be accepted without being carefully screened. One problem that I think people correctly point out, and I've certainly seen it (I've been moderating online for twenty-five years now, in various forums), is that if people can just post anything they want there, that makes the discussion less useful for most readers. The classic example of that, of course, is spam, right?

Eugene Volokh (22:48):
If you don't have some sort of filter on the comments on other people's pages, they get drowned out in spam and become much less useful to people. Now, the difference from the hosting function is this: if there happen to be some spam Twitter feeds or whatever else, that's not a problem for me, because I'm just not going to go to them. But if there's spam posted in the comments on the Twitter feeds I do read, then I'm not going to want to read those feeds or those comments. Another, related thing is personal insults. They could be racist, let's say, or anti-gay or whatever else, or they could just be other kinds of things. My sense is that that poisons the conversation in many ways. And people could just ignore them, but it's not easy to ignore them altogether.

Eugene Volokh (23:32):
And the result is that people will be less likely to engage in that kind of conversation. So there is, again, real value in the platforms providing some curation for these comments. Those are just examples; we could go into much more detail, and if there were an eventual law, somebody would have to define all this in much more detail. But I think it's a reminder that there are different things that these companies do, and as to certain things, it makes sense to say, look, just open it up to everybody. But as to other things, if you open it up to everybody, including the spammers, that's going to ruin the whole experience, not just for the companies but also for the users. So that's why I think we need to be cautious in just saying, "Oh, we have to completely eliminate their discretion," the platforms' discretion. I think that would be a bad idea.

Nick Johnson (24:29):
You also mention, I think in your paper, that some of these companies have kind of stepped into this vacuum. Facebook famously has created its own content moderation council, which Mark Zuckerberg has called the Supreme Court of Facebook. What's your take on that kind of self-regulation? Do you think it will be effective?

Eugene Volokh (24:54):
Hard to tell. There's a famous line by Justice Oliver Wendell Holmes about the First Amendment, or rather about the Constitution: "It is an experiment, as all life is an experiment." So I welcome experiments. I'm not sure how successful it'll end up being; I do think that so far it seems to have reined in some of Facebook's over-censorious tendencies. At the same time, one way of thinking about it is this: Facebook is vitally important to American political life, including election campaigns, including very close election campaigns, where the ability to use this medium or not may very well make the difference in a country that's split (look at the House and the Senate; they're split literally 50-50, or very close to it). So who gets to decide, and what are the criteria used to decide, what political candidates and other important players in public debates, including ordinary citizens, get to say in this important medium?

Eugene Volokh (26:00):
One possible answer is: it's decided by the legal rules, by First Amendment law. So maybe a few regulations are permissible, on libel, copyright infringement, and such, but generally speaking, speech is protected, and that includes bad speech. So that's one possibility. A second possibility is that Mark Zuckerberg should decide. Maybe not just Mark Zuckerberg; it's always more complicated than the one guy. There are the shareholders, but also other influential employees within an organization. Even if the boss is the boss, the sentiments of the other employees are very important, because he wants to keep his employees happy; otherwise they'll leave, and that's bad for the company. So maybe it should be kind of the Facebook higher-ups, and just in general the tech elites. And I don't use "elite," by the way, pejoratively.

Eugene Volokh (26:57):
There's a reason that they're elite; a lot of them are really very smart people, maybe quite well-intentioned. So you could imagine saying, look, we're in a private property system, and of course private property owners, and those people who have their ear, should make these decisions. A third possibility is: let's have it be decided according to principles of so-called international human rights law. And that's what Facebook is appealing to. So it sets up this board (I think it's 20 members right now, of which only a quarter are Americans, which is perfectly understandable, because after all Facebook is a worldwide company). They're not applying, generally speaking, American First Amendment law, except to the extent they're applying it by analogy; they look to the rules of international human rights law, which are set by, again, a certain set of kind of international legal elites. Those rules protect certain kinds of speech,

Eugene Volokh (27:47):
don't protect certain other kinds of speech, favor certain ideas, and disfavor other ideas. Again, maybe they're right; maybe those are acceptable rules. I'm just skeptical that that's how American elections and American political life should be run: with attention to what foreigners think the international norms should be. And I would apply that to any other country, too. I don't think the French should tolerate French elections being run by applying the standards of what Americans think the rules ought to be. So I'm not wild, on balance, about the oversight board. Maybe it's the best of a bunch of options that each have their own problems; maybe it's the best solution. But I do think one concern we should have is that this is American elections being run not by the kind of well-established rules of American First Amendment law, but by the decisions of these powerful companies that delegate these decisions to these lawyers (some of whom, by the way, I know and personally like very much). I'm just skeptical that that's the right way of running our political system.

Alex Moazed (29:04):
So to be qualified as a common carrier, this would actually need to be passed as a law, either at the federal or state level, and it could either label specific companies, or it could define a specific class of companies to be treated as common carriers.

Eugene Volokh (29:21):
Right. Although, let me just note, by the way: part of the problem is the title. You accurately quoted the earlier title; I've revised the title of the version that hasn't been published yet (it should be published in a couple of weeks) to "Treating Social Media Platforms Like Common Carriers?" Because I find people saying, "Well, no, no, the legal definition of common carrier is this," and indeed you can look up certain court cases that say it's defined this way, and other cases that define it some other way. To me, the question isn't whether they are common carriers under some objective definition; it's whether they should be treated like that. And that is a decision that, again, might be made by Congress, or might be made by state legislatures. So you're right; the question at this point is what laws, if any, these legislatures should enact.

Alex Moazed (30:06):
And so, similarly, you talk about cell phone companies, telecom companies: they have a lot of attributes similar to the social media platforms in terms of the potential negative impacts of being a common carrier. For example, we all get spam calls; we all get spam text messages, right? So how do telecoms handle companies or users that are abusing the cell phone or telecom network, and how do they treat these bad actors that are just a nuisance to the network, versus revoking the hosting right or privilege that comes with common carrier status from users who are carrying opinions that fall outside the lines of that platform or that common carrier? These problems don't seem insurmountable to me. You address a number of them in your paper, I think to get out in front of what some of the concerns or negative impacts of being labeled a common carrier would be. It seems like there are ways to work through them under common carrier status, and there might be some precedent to provide guidance about how best to do that. Is that your general feeling toward it, or what's your sentiment?

Eugene Volokh (31:23):
Yeah, I'm going to say this is something that's worth experimenting with. And I do think that you need to make sure that things are written and drafted the right way. So, for example, there's a Florida law that was blocked by a federal judge in Florida, and I think quite correctly blocked, in part because that law limited platforms' ability to respond to things: it limited their ability to, say, label some tweet with "we think this is wrong, and here's why." That is the exercise of the platform's own free speech rights. Now, sure, they can exercise those rights in biased ways, but the First Amendment includes your right to speak in biased ways. And in fact, some of the time the speech that they add, that they contribute, is a valuable thing for users and for public debate. If they think that something really is wrong, I think the proper solution is for them to say "here's why we think it's wrong," rather than to delete it.

Eugene Volokh (32:19):
Likewise, I've seen some proposals that say platforms can't block viewpoints, well, except viewpoints that are pro-terrorist, or viewpoints that encourage self-harm, or something like that. You can't have a viewpoint-neutrality mandate that is itself viewpoint-based, that itself excludes certain kinds of viewpoints from protection; that, I think, would be unconstitutional. So a lot depends on how you implement it. Also, you don't want to put the platforms in a position where they're not just going to be inundated with lawsuits, but potentially very much damaged by even the innocent mistakes that are going to happen. One of the things we have to recognize is that platforms do have to restrict some things, in part for copyright reasons: current copyright law basically requires them to remove certain material

Eugene Volokh (33:20):
that's infringing copyright, once they're on notice of it. So they're going to have to make these decisions, and there are going to be errors. There's going to be somebody, perhaps just some individual line operator, who blocks something and doesn't block something else, and maybe the reason is he just doesn't like the ideology of the thing that he's blocking. So you don't want to have a situation, for example, where you say: the moment you discriminate against any material based on viewpoint, you lose all of your immunity against libel lawsuits, against defamation lawsuits. Then just this one error ends up costing you hundreds of millions of dollars. That's not fair, I think, and it's not something that's going to be effective, because it's going to make platforms too reluctant to delete things that they should be deleting.

Eugene Volokh (34:07):
The other thing to keep in mind is, if you do do it at the state level, we have to think about how you keep states from exporting their rules outside their borders, and how you avoid a situation where one state has one rule and another state has an inconsistent rule. Imagine, for example, that the state of California says, "Platforms, you must block certain kinds of material," and the state of Texas says, "Platforms, you may not block certain kinds of material." Well, what do they do? I think there are ways of dealing with that. Geolocation technology, the technology for figuring out where users are coming from, is far from perfect, among other things because people can use technology to hide their location, like virtual private networks and the like, and partly also because there are cities that straddle a state line. So you may not know where somebody is coming from, but I think it's close enough that you can imagine state laws focusing on speech that is in some sense within the state, which, again, you'd have to define specifically.
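To make the conflict Volokh describes concrete, here is a minimal sketch in Python. The names (STATE_RULES, lookup_state, moderation_obligation) are entirely hypothetical illustrations, not any real platform's API or geolocation service: the point is just that a platform would have to dispatch on an imperfect guess about the user's state, and two states' rules can directly conflict.

    # Hypothetical illustration only: STATE_RULES and lookup_state() are
    # invented names, not a real platform API or geolocation service.
    STATE_RULES = {
        "CA": "must_block",   # imagine a state mandating removal of some category
        "TX": "must_carry",   # imagine a state forbidding removal of that category
    }

    def lookup_state(ip_address):
        """Stand-in for IP geolocation, which is imperfect in practice:
        VPNs hide the user's true location, and metro areas that straddle
        a state line blur it."""
        fake_geo_db = {"203.0.113.7": "CA", "198.51.100.9": "TX"}
        return fake_geo_db.get(ip_address)  # None if we cannot localize

    def moderation_obligation(ip_address):
        """Which (hypothetical) state obligation applies to this user's post?"""
        state = lookup_state(ip_address)
        if state is None:
            return "unknown"  # cannot localize: whose law applies?
        return STATE_RULES.get(state, "no_rule")

    print(moderation_obligation("203.0.113.7"))   # -> must_block
    print(moderation_obligation("198.51.100.9"))  # -> must_carry: direct conflict

The same post can thus trigger contradictory obligations depending on a guess about where the reader sits, which is why, as Volokh notes, state laws would have to define "speech within the state" carefully.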

Eugene Volokh (35:21):
So you need to make sure that anything that's done along these lines is done carefully. I think one advantage of having it done on a state-by-state basis is that there can be some degree of experimentation. So if North Dakota tries something and it totally screws up, to the point where North Dakotans can't have useful conversations anymore because they're inundated with spam or with personal insults on their Facebook pages, then we could say, "Thank you, North Dakota, we've learned something from your experience."

Alex Moazed (35:47):
I agree; I think the states are a great place to trial this out. Nick, were you going to say something?

Nick Johnson (35:53):
Part of the challenge that we see in the state of affairs today is what I would call the lack of transparency, and the kind of almost arbitrary nature of the platforms' power. If Facebook decides to kick me off, I don't necessarily know why; maybe I get a very vague reason, but it doesn't necessarily explain to me exactly what it was that I did that was wrong. How would you view, or what would you view as, potential solutions to that problem?

Eugene Volokh (36:18):
Right. So that is something that people have been talking about. Some people even say, look, we don't want to regulate their ability to block certain viewpoints, because we think there's value in having, perhaps at some point, especially if there's going to be more competition, a left-wing platform and a right-wing platform. One thing that really influenced my thinking is the experience that Parler had. People used to say, "Well, if you don't like Twitter's ideological restrictions, start your own." So Parler got started, and then it got blacklisted by Amazon Web Services and the Google Play store and the Apple App Store. It finally managed to come back despite that, as a shadow of its former self, as best I can tell; it managed to survive at all only because it apparently has a billionaire investor behind it. And the message, I think, that sent loud and clear is that a lot of these big tech companies are not going to tolerate competitors, or at least people who have a different ideological perspective.

Eugene Volokh (37:24):
But if we could get past that, maybe it's good to have kind of a liberal platform and a conservative platform and a libertarian platform and a no-holds-barred platform. In order for that to work well, though, we want to have a real sense of what the actual rules are as they're actually being applied. And one thing that really frustrates me as a policy analyst is that people often say, "Oh, I was banned because of X," and I'm not sure; I need to be skeptical about these claims. If somebody says, "I was banned because I was posting about crypto," or "because I was posting about gun rights," or whatever else, I need to say, "Really? How do I know that?" Maybe you were banned for some other reason, or maybe it was just some technical glitch, or maybe it was just a human error that, if you had only appealed, would have been promptly fixed. We have to expect some amount of such human error. But given the very limited responses that the platforms offer, often it's very hard to tell what's going on.

Eugene Volokh (38:20):
So some people say: let's allow them to impose whatever rules they want, but they have to be transparent about them. There are limits to transparency, though, because part of the problem is that with some of these decisions, they could say, "Well, we blocked it because it violated this rule." Well, all right, why do you think it violated this rule? And at some point somebody says, "You know what, I interpreted this word in a particular way." The words are often vague enough that this is going to give us only a limited form of transparency. It's kind of like if a jury says, "Yes, we're going to hold you liable because you were driving unreasonably": that provides some transparency, but not a lot, because it doesn't tell you exactly what it was that was unreasonable. So I do think there are difficulties in providing transparency. I do think it would be good to have more of it, to the extent possible. But the one other problem is that, since a lot of these decisions are made on an algorithmic basis, often the algorithms are opaque even to their own creators. Presumably, though, there are appeal mechanisms from the algorithms to some human decision makers, who might then be required to offer some explanation.

Alex Moazed (39:26):
Under the current construct, right, go back to the telecoms. If you're a user, let's say you've been harmed by a telecom. You could, I guess, appeal to the telecom, but you could also appeal to the FCC; the government has bestowed that regulatory authority on the FCC. And so what you would now have on your side as a, let's say, creator that has been harmed by a social media platform, or kicked off unfairly, is the ability to appeal to a government agency, a government regulator, as opposed to having to appeal to the platform that kicked you off and has really no incentive or impetus to respond or give you that transparency we're talking about. Right? So theoretically, that is the construct: once common carrier status is applied, either at a federal or a state level, you're also then bestowing this kind of regulatory power to help protect the creators. That is the reason why common carrier status was put into effect in the first place. Is that somewhat how this would theoretically play out?

Eugene Volokh (40:47):
It's very hard to tell, and that's part of the problem, right? All of these things have to be enforced by people, and there's always the question of who's going to guard the guardians. On one end, you could say, well, let's allow this to be done by basically ordinary litigation. This is the way a lot of rights that people have are enforced: you can sue, and you go to court, and there's a judge and there are jurors. But judges screw up too; jurors can be biased in all sorts of ways. Plus, it's also a very time-consuming, expensive process. So we say, well, let's bring in expert administrative agency personnel, like the FCC: that's both the kind of big appointed commissioners, and maybe also hearing examiners under the FCC. The Patent and Trademark Office, for example, deals with a lot of questions through these kinds of lower-level,

Eugene Volokh (41:42):
so-called administrative law judges. Well, all right, that sounds more efficient and quicker. But on the other hand, the FCC is a federal government body. Its members are appointed by a political official, namely the president, and they are then confirmed by other political officials. And while their staffers might be career employees, we may not trust them either, because you might say, well, they come from a very peculiar set of people: they are, generally speaking, lawyers who have chosen to spend their careers working for the federal government. They have their own biases. At least jurors' biases are a little bit more representative of the public's biases, and a little bit less likely to be kind of inbred, ingrown, from everyone working for the federal government.

Eugene Volokh (42:41):
So you say, well, let's not do the FCC; that would be a bad idea. Let's go back again to courts. And then people say, well, wait a minute, it's so expensive to litigate. Eventually, at some point, somebody throws up his hands and says, well, maybe we can't do any better than to have the companies decide. One possibility is to say we should leave it to private decision making, but then we really need to make sure that there really are lots and lots of rival companies out there. So that's why some people say we should forget about all this common carrier treatment and try to ensure real competition. And because of network effects, because the platforms are valuable precisely because they network people together, you can't just break up Facebook into one Facebook for a hundred million people, a Facebook for another hundred million, and a Facebook for another hundred million, because then how can they talk to each other?

Eugene Volokh (43:35):
So what you really need to do is set up some communication infrastructure, something that makes them open. It's just like with telephone companies, right? We don't have a telephone system where, if you're on Verizon, you can talk to the Verizon people but not the Sprint people, and the like. So maybe what you need to do is forget about all of this extra government regulation at the level of policing individual decisions, and instead turn to structural regulation: make sure that Facebook and Twitter are set up so that the underlying communication protocols are available to everybody, so that people can talk using a Facebook-like system, but decide which Facebook-like company they want to deal with. Again, just like phone companies: we get to decide that without having to limit the people we're conversing with to the subscribers of that one company. So lots of people have lots of ideas about this. I'm not sure which is the best one, but there certainly are possible downsides to all of them. It's a problem.
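The structural alternative Volokh sketches resembles federated systems like email, where rival providers interoperate over a shared delivery standard. Below is a minimal sketch of that idea in Python; the Message, Provider, and REGISTRY names are illustrative assumptions, not any real or proposed specification.

    # Hypothetical sketch: rival "Facebook-like" providers implementing one
    # shared delivery protocol, so users can converse across providers the
    # way Verizon and Sprint callers can. Not a real spec.
    from dataclasses import dataclass

    @dataclass
    class Message:
        sender: str      # e.g. "alice@provider-a.example"
        recipient: str   # e.g. "bob@provider-b.example"
        body: str

    REGISTRY = {}  # domain -> Provider; the open-infrastructure piece

    class Provider:
        """One social network; the shared protocol is just deliver()."""
        def __init__(self, domain):
            self.domain = domain
            self.inboxes = {}
            REGISTRY[domain] = self  # join the federation

        def send(self, msg):
            # Route by the recipient's domain, as email routes by mail server.
            dest = msg.recipient.split("@")[1]
            REGISTRY[dest].deliver(msg)

        def deliver(self, msg):
            self.inboxes.setdefault(msg.recipient, []).append(msg)

    a = Provider("provider-a.example")
    b = Provider("provider-b.example")
    a.send(Message("alice@provider-a.example", "bob@provider-b.example", "hi"))
    print(b.inboxes["bob@provider-b.example"][0].body)  # -> hi

Under this kind of design, users would choose a provider for its moderation policies without losing the ability to reach people on other providers, which is the phone-company analogy in code form.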

Alex Moazed (44:43):
Yeah, the interoperability. And I'm sure you're familiar, there are a few bills that I think have now officially left the House Judiciary Committee that touch on a few of those different kinds of structural reforms you're alluding to. My general problem with all of this is that if it's legislation that needs to go through multiple chambers of Congress and then get signed into law, we're talking about years for any of this to be put in place. Similarly, you have Lina Khan just appointed to head the FTC. She has written papers arguing that big tech needs to be reined in, and all of these things sound great. My concern about Lina Khan is that she also says she needs to be bestowed the power to properly rein in big tech, and she alludes to the fact that, I guess, she's of the opinion that the FTC actually doesn't have the authority

Alex Moazed (45:48):
it needs to go after these big tech monopolies. I don't know if you have an opinion there, but that also gives me hesitation: are Lina Khan and the FTC actually going to be able to get anything done in the next few years? But the thing that actually gives me the most optimism, from what you're talking about here, is the state level. And I think that's the beautiful thing about our republic: the state-federal relationship, and how the states can move much faster to try out something like this. And you give the Florida example, where, without the nuance that you're speaking to, and without actually doing this under kind of the common carrier umbrella, that was exactly a similar line of thinking to what Florida has put into motion in a preliminary fashion. I'm sure there will be other iterations of it. So that actually gives me the most optimism: to try this at the state level, just in terms of getting something done before I get more gray and white hair.

Eugene Volokh (46:49):
You know, these are all very important and difficult questions: how do you get things done promptly? Well, you don't want the government moving too quickly. We have a system, especially at the federal level but also in each state, of checks and balances, for a reason. You don't want somebody to say, "Oh my God, disaster! Let's enact something that completely restructures this particular sector of the economy," and then in three months the result might be, and is especially likely to be, worse than the alternative. And that's particularly true when the country is split, and that's reflected in the split in the House and in the Senate.

Alex Moazed (47:28):
Professor, there's moving quickly, and then there's any movement at all. I mean, it's 2021, so I feel like we've had plenty of time.

Eugene Volokh (47:37):
I'm not at all a believer in "don't just stand there, do something"; that's the way a lot of bad things get done. Sometimes "don't just do something, stand there" is the better approach. I like the idea of things being done at the state level, though, as you pointed out, for the very reason that it allows you to experiment a bit. At the same time, we have to acknowledge that certain things can't be done, I think, at the state level, because we don't want a situation where one state sets the policy for the rest of the country. So if one state, and let's say it's one of the least populous states, Wyoming, decides to say that Facebook must be interoperable, it would be saying, for the whole country: we now need to run this experiment in which Facebook is radically redesigned, in a system that people throughout the whole country are using. On the other hand, I think something that just says "you can't block Wyoming residents' communications, especially with other Wyoming residents, based on, let's say, their viewpoint or some such": that's something that is at least technically and legally perhaps more localizable to a particular state.

Alex Moazed (48:47):
No, and that's good; it's appropriate coming from the perspective you have. I think it's just like the conversation around Section 230, for example, right? If you were to eliminate those Section 230 protections for all tech companies, that would absolutely be a net negative, because think about all the startups that just don't have the resources to invest in these kinds of things. So if this common carrier language, for example, is done at the state level, and specifically targeted at these tech monopolies that are having the greatest level of impact and are doing the greatest amount of harm and transgression with these very broad-stroke decisions they're making on content censorship, then that makes me feel a lot more comfortable. The moment you start to make these rules apply much more broadly to all tech companies,

Alex Moazed (49:37):
now I actually think you're doing overall more harm than good. And it'll be interesting to see, if a state does do this, to whatever degree they try to impose on these tech monopolies, whether the tech monopolies try to do what they did to Australia. When Australia put the screws on Google and Facebook to pay the media companies more fairly for Google News and Facebook News, Facebook and Google threatened to leave Australia altogether, and Australia called their bluff. They ended up not leaving Australia, but came to the table and are now paying more money to the media companies. But I wouldn't put it past the Googles and the Facebooks of the world to really up the ante and aggressively say, "Hey, Florida, if you're going to make us a common carrier, we're leaving," or something to that effect. I'd be very curious to see how that plays out. But yeah, I like this direction that you're talking about.

Eugene Volokh (50:42):
You're quite right; there's something to that. When you think about it, as in other things in any political system, you can't just view it as "they're the ones being regulated, and we're the ones doing the regulating." The regulated have something to say about it. They may say, "We don't want to be regulated, so we're going to try to kick you, the regulators, out of office," or they may threaten to just refuse to do business in a state that looks to be bad for their business. So those are real possible moves in this game.

Alex Moazed (51:10):
This has been a fantastic conversation. Any parting thoughts or opinions we didn't cover today on the topic?

Eugene Volokh (51:18):
We've covered a lot, and I'm sure there's a lot more left to cover, but I've very much enjoyed it. Thanks very much for having me.

Alex Moazed (51:23):
So what are the next steps for you? This was a rough draft here, published July 5th; you've updated the title, and you're coming out with another version of it. How can we continue to follow your work and your thinking on the topic?

Eugene Volokh (51:36):
I have a Twitter feed: that's @VolokhC, C for Conspiracy, because that's the name of our blog, the Volokh Conspiracy, for historical reasons; it's a little joke, since it's a bunch of law professors. So if you follow @VolokhC, you can see what we have to say on a lot of subjects, and there's also a feed just for the free speech posts. You can also search for "Volokh Conspiracy," or go to volokh.com, which forwards to Reason magazine, which is where we're currently hosted. The actual article is coming out in this new journal that I helped start, called the Journal of Free Speech Law. We're hoping the first items are going to be published electronically within a couple of weeks, and then in print maybe a month or so later. But the journal articles are mostly read by lawyers and law professors, and occasionally by judges; the stuff that we do at the Volokh Conspiracy, hosted at Reason, is more aimed at the general public.

Alex Moazed (52:33):
Well, Professor, it's really been a pleasure having you on. We'll make sure to follow the upcoming revisions and publication here. Thank you so much for your time; we hope to have you on again.

Eugene Volokh (52:44):
Thanks very much. I look forward to it. All the best.

Alex Moazed (52:46):
Well, that's it for us today on Winner Take All. Really a delightful conversation, and we will talk to you soon.

 

