If you missed the January 17 event, you can watch it here. Bottom line: Big Tech “is probably the single greatest threat to our ability to speak and to associate that we face right now … They control more than the government controls….”
On Sunday night, January 17, the Legal Insurrection Foundation held an Emergency Event: Surviving The Big Tech Purge.
The event was scheduled on short notice in light of the rapid deplatforming across Big Tech, particularly the takedown of Parler by Amazon Web Services and its removal from the Google and Apple app stores.
The panelists, in addition to me, were Amy Peikoff (Parler’s Chief Policy Officer), Michelle Ray (a tech security enthusiast), and Kemberlee Kaye (Legal Insurrection website’s Senior Contributing Editor, and the LIF’s Director of Operations and Editorial Development).
I usually pick a composite photo of panelists depending on which makes me look the best. Sorry, not sorry. Unfortunately, there weren’t any good ones of me, so I picked this one:
The main program was 50 minutes, plus another 35 minutes of Q&A. We had approximately 400 people attend online. The full program video is at the bottom of this post.
Embedded immediately below is a highlight reel excerpting 14 minutes from the program, along with a transcript of the highlight reel. Please note that the highlight reel compacts clips of statements by the respective speakers; these are not single, uninterrupted presentations. It should be obvious where the breaks in the original come (but if there’s any doubt, you can watch the full video at the bottom of the post).
Also, the transcript is mostly auto-generated, so there may be transcription errors.
HIGHLIGHT REEL VIDEO
Here is the highlight reel:
HIGHLIGHT REEL TRANSCRIPT
(Mostly auto-generated; there may be transcription errors. Time stamps are approximate.)
Kemberlee Kaye, Legal Insurrection Foundation (00:09):
We’re simply here to discuss and advise you on matters of personal privacy, and only as it pertains to legal, lawful, constitutionally-protected speech. Everyone’s really concerned about…if we can’t use the big platforms that we’re used to using and have been funneled into over the last decade, where do we go? What do we use? Which ones are better? Which ones should we stay away from and why? And know that these are, again, extraordinary times. Usually I talk about how excited we are to host these events. I think the tone and tenor is a little bit different now, as we’re looking into another era. That said, though, be at peace and know that you are not alone. So, when you’re here, there’s tons of other people who were here who have the same concerns as you, and, together, we’re going to get through all of this. (…)
Professor William A. Jacobson, Cornell Law School (00:57):
We are going to try and talk about one of the most serious problems that, I think, we face as “non-liberals”. And I emphasize that it’s not just conservatives who are under fire. It’s anybody who is center or right of center [who] is really under fire. And I also want to reiterate what Kemberlee mentioned, that nothing that we’re talking about tonight is meant to protect people or to advise people on how to do anything that’s illegal. We are talking about protecting constitutionally-protected, free speech and freedom of association. (…) Collapses of societies happen slowly and then very suddenly. And I think we are in the “very suddenly” part, particularly as it pertains to internet, internet freedom, and freedom of speech, and the ability to speak freely. (…) And the shutdowns that took place on campuses really were the precursor to what is going on now. And the way they accomplished it was [through] expansive notions of hate speech.
Professor William A. Jacobson, Cornell Law School (02:05):
What is “hate speech”? Of course, there’s no [such] constitutional thing [as] “hate speech”, but what they did was they established certain people who would not be allowed to speak, even if their speech was protected by law. They call them “Nazis” or “fascists” or whatever they want to call them. And then they’ve expanded it. And now everybody [is] a “fascist” or a “Nazi” or something else. They expand [these labels] to suit their political needs. And that’s how they would keep people off of campuses. And then it moved off campus. And we actually have a test case of this. And the test case of it was when we moved to Parler. In 10 years at Twitter, we could barely get 40,000 followers. In eight months at Parler, we had 400,000, and it became one of our major sources of traffic. (…) We’ve been throttled many times by Facebook.
Professor William A. Jacobson, Cornell Law School (02:55):
They eventually told us [restrictions have] been lifted. They still haven’t told me what it is we did wrong. And so that’s another example of this slow thing we’ve been seeing over time. (…) De-platforming has become worse, but it’s also been slow in coming. But, these models established themselves. (…) So you had the campus model of shouting people down, the expansive notion of “hate speech”, [and] calling everybody a “fascist” who shouldn’t be heard. (…) You have large internet platforms like Twitter and Facebook, which control so much of the political speech in the country. And then, you have models [that were] developed against the worst actors. And those models are now being applied to a lot of different people, including Parler. (…) The difference now, I think, is that the “Big Tech” oligopoly isn’t reacting to outside pressure; they are initiating this on their own. And that is really one of the most worrying things. (…)
Professor William A. Jacobson, Cornell Law School (03:56):
I think it’s fair to say that “Big Tech” is probably the single greatest threat to our ability to speak and to associate that we face right now, because, if the government were to do that, it would clearly be a violation of the First Amendment. (…) We could sue them in court. We could sue them in federal court, but how are you going to sue Facebook? How are you going to sue Twitter? People have tried. It’s a real problem because they’re private companies. They control more than the government controls in many ways, but it’s harder to assert legal theories about them. And I do think that what’s happening now to Parler, which we’ll hear more about, really is the test case for the modern, complete de-platforming model that used to apply only to the worst of the worst of the worst. And now is being broadened [to]… more businesses, more websites. You’re going to have to have plan B and plan C, and maybe plan D, if you want to stay alive. (…)
Amy Peikoff, Chief Policy Officer, Parler (04:59):
So, thanks for having me here. At Parler, as you can imagine, we have been working steadily, almost around the clock; although we have started to actually make more time for sleep and things ’cause we’re in this for the long-haul. But, we are working steadily to get ourselves back online and better than before. (…) I do appreciate what you said, Professor, about the… issue of the content, this inciting and violent content that nobody wants on their platform. As I said from the beginning, I knew that content was everywhere. And I did think that Parler had been unfairly singled out [among the] other platforms and that, perhaps, on Facebook where the “Groups” feature existed, it was even more… the place where people could plan easily, the sorts of events that went on the 6th of January. So, I think now the truth has come out, but, of course, it’s too late, and we’re de-platformed, and we’re singled out. (…)
Amy Peikoff, Chief Policy Officer, Parler (06:05):
I mean, on Parler, for example, we’ve delivered a chronological feed, no algorithmic manipulation of content. So that’s perhaps why you got more engagement [on Parler], while you would have more followers [on Twitter]. We’ve heard those anecdotes a lot of times, that people who supposedly even had larger followings on Twitter would get more engagement in terms of comments and “echoes”, and things like that, on Parler. So that is one thing that we did offer…we’re going to when we get up again [offer] just a completely [un]manipulated feed, no shadow banning, no…disparate application of guidelines or Terms of Service based on ideology. None of those types of things. (…) First thing is the legal immunity, right. Section 230 gives legal immunity not only for content that is generated by users, but it also provides legal immunity for the activities, the moderation activities, that the platforms do.
Amy Peikoff, Chief Policy Officer, Parler (07:16):
And I would put in that category, for Facebook or Twitter, any of these [companies], both [for the] content they’re removing and why, [their] decisions about content removal, and their decisions about any content manipulation, any engagement-enhancing algorithms that they have that make it so that they depart from a purely chronological feed, like the one that we have, in order to favor certain content and disfavor others, for whatever reason, right? It could be for engagement. It could be also because they want to suppress viewpoints with which they don’t agree. Of course, we don’t know what’s under the hood there, because all of this is something that you really can’t call into question. And I think, in large part, this is because of the Section 230 immunity. It’s very hard to try to find out even what’s under the hood or [to] hold them accountable for this type of behavior.
Amy Peikoff, Chief Policy Officer, Parler (08:11):
Now that’s the one thing…the legal immunity. The second thing is that, as we’ve seen, I’m calling them “hearanguings” (…) By the way, I’ve got a blog post over at my blog. The politicians are pressuring these companies to remove more so-called “hate speech” from their platforms. And, at the same time, also remove or otherwise tag or put disclaimers on so-called “misinformation”. Those are the big categories that they talk about. And, of course, there might end up being others. But if you put together this legal immunity that is granted for the removal of content with the pressure from politicians to engage in more of [these removals], and, in particular, to restrict the flow of speech that would be protected by the First Amendment, [then] that is a very dangerous combination. And it’s one where… I’ve always [resisted] calling [the activities of these platforms “censorship”]. (…)
Amy Peikoff, Chief Policy Officer, Parler (09:15):
You can see that when you combine those factors, that you could think of it as “censorship” by proxy, that we’re actually engaging, [that is] we’re entering, into a “fascist” era where these companies are private. (…) Technically, supposedly, they’re private, but there’s such a relationship. And the companies have become dependent on the immunity that’s granted to them to preserve the pillars of their monetization model, meaning those engagement algorithms. And, in addition, any sorts of data that they are mining and gathering and profiling and things like that, all of that is protected because of the [Section 230] immunity and their cozy relationship with the government. (…) [The message] to these platforms [is to] moderate more; they’re not going to have their Section 230 immunity unless they are doing the government’s bidding. And of course, this is all necessary in order for “public safety”, et cetera. (…) If you talk about a future in which we’re going to have an enhanced Section 230 along the lines that Mark Zuckerberg is hoping for, with regulation that’s going to require the removal of more and more content, recordkeeping, and potential invasions of privacy, take that scenario versus a repeal of Section 230, which might result in some litigation fees, at the beginning, and be kind of messy, but then common law judges, smart judges dealing in the real world, can have a chance to work these cases out and be rational about it and hold platforms to a normal standard of care. We actually don’t think it would mean the end of social media platforms on the Internet. It need not. So that’s our thinking in a nutshell.
Michelle Ray, Tech Security Enthusiast (11:01):
[I come from a] completely different place than almost everyone else on the panel. (…) And the idea, for me, that we need to focus our energy on fighting what is happening right now, particularly in technology, is actually, in my opinion, a waste of energy. Things have always evolved. People have always innovated to overcome, and while there [are] things happening much faster now, [the solutions] are out there. The solutions are out there. And so, I tend to caution people that what you’re using right now, as a platform, as a means of communication, as a means of streaming television, as a means of storing your most precious memories, [is] not [the] permanent or the only solution. And those things innovate daily and rapidly. And sometimes for the better and sometimes for the worse. (…) And I get a lot of pushback, particularly from the center and right-leaning community, about taking the fight to the left and to liberals.
Michelle Ray, Tech Security Enthusiast (12:05):
And that, you know, the right and center-right shouldn’t need to abandon these platforms, that their voices aren’t being heard; they’re being censored. And that may well be, but that doesn’t leave us without opportunities. And those opportunities include taking our audiences with us. One of the things that should be very, very clear to people who have been using social media for the last 10 years is that you’re the product of these social media companies. You’re fed advertisements all day long. Your data is collected and sold. You have no privacy expectations whatsoever on any of these platforms. And while the convenience has outweighed those perils for people, until now, I think a lot of people are starting to see that. And rather than taking the fight to these platforms and spending the time continuing to be their product, despite the fact that they are censoring your ideas, my suggestion has always been [to] use them to the best of your ability while you build an audience, while you recommend that your audience move elsewhere. (…) Before I mention any platforms or solutions specifically, I do want to say that many of these are new. They are evolving and innovating as the need for them happens. (…) I would say the very first thing I would suggest you do is make a list of the services you definitely think you need in your life. Because, as Kemberlee mentioned before, the more services you’re using, the more information about yourself you’re putting out there and the more you are inviting these platforms into your personal life and your private information.