Turley: GoFundMe Canceling Candace Owens Could Affect Its Immunity ‘Under Section 230’

“Many other[s] have expressed equally controversial opinions about police officers, Trump, and others. Will they all be now banned from raising charitable donations?”

https://www.youtube.com/watch?v=rHqPm1egYu4

Professor Jonathan Turley wrote that GoFundMe canceling Candace Owens’ fundraiser for an Alabama cafe “could undermine the position” of the company “against the loss of immunity under Section 230 of the Communications Decency Act.”

Owens, who has criticized George Floyd, started a fundraiser for Parkside Cafe owner Michael Dykes after people boycotted his business because he called Floyd a “thug” and described protesters as “idiots.”

Owens came under fire for saying “Floyd was not a good person” and that it sickened her “that he has been held up as a martyr.”

GoFundMe said it took down the fundraiser “because of a violation of our Terms of Service, specifically our prohibition on ‘user content that we deem, in our sole discretion, to be in support of hate, violence, harassment, bullying, discrimination, terrorism, or intolerance of any kind.'”

We heard Section 230 come up after Twitter fact-checked a tweet by President Donald Trump.

What is Section 230? Turley explained on June 1:

The heart of the executive order is Section 230 of the Communications Decency Act. The 1996 legislation signed into law by President Bill Clinton was largely an attempt to regulate pornography and was struck down in significant part as unconstitutional. Section 230, however, survived and grants any “interactive computer service” (including Internet and social media companies) immunity from most lawsuits over content posted by users. Courts have interpreted the provision to give sweeping immunity for companies like Twitter and Facebook because they simply supply a forum for others to express themselves.

Turley explains the complications with 230:

The immunity under Section 230 of the Communications Decency Act is based on the theory that these sites are not responsible for content. Courts have interpreted the provision to give sweeping immunity for companies like Twitter and Facebook because they simply supply a forum for others to express themselves. These sites are now actively engaged in forms of private censorship.

And what about those who have expressed similar thoughts about people on the right?

Again, my interest is not in the content of these comments but the role of previously neutral forums to engage in content based private censorship. Both the owner and Owens were expressing their views of Floyd. Many other[s] have expressed equally controversial opinions about police officers, Trump, and others. Will they all be now banned from raising charitable donations?

Turley is correct when he says that we can expect to see even more demands to remove fundraisers “by those who want to silence people with opposing views.”

You reap what you sow.

Comments

2smartforlibs | June 10, 2020 at 12:42 pm

If you were not aware of how crooked GoFundMe is, YOU are now.

    notamemberofanyorganizedpolicital in reply to 2smartforlibs. | June 10, 2020 at 3:46 pm

    Bet they got GoFundMe for these, though.

    Sesame Street Teaching Children America Is A Racist Country…

    Homeless People Ate All The Food In The Capitol Hill Autonomous Zone, Situation Dire

    Ocasio-Cortez Stumped When Confronted On Studies Showing Defunding Police Leads To More Crime

    Weasel Zippers

Scott Adams made the connection that governing by “empathy” instead of rules inevitably leads to government by whoever complains the most effectively. Here we are seeing the fruits of that, as more and more frequently, people win their arguments by complaining louder, harder, and more vigorously, because that has become the only way to win their case.

So where does it go from here I wonder…

These web sites cannot have it both ways.

Either they are protected by law from lawsuits derived from content on their sites, in which case they have no reason to censor anything.

Or they censor content they do not like and so become editors and publishers responsible for the content on their sites. The act of censoring content makes one legally liable for what is not censored. They are assuming responsibility for the content on their sites.

Here’s hoping for a Trump reelection and a GOP takeover of the House.

    DaveGinOly in reply to JHogan. | June 10, 2020 at 1:13 pm

    It’s as if they believe a woman can be “half pregnant.”

    Milhouse in reply to JHogan. | June 10, 2020 at 2:21 pm

    They can have it both ways, because the law explicitly allows them to.

    Either they are protected by law from lawsuits derived from content on their sites, in which case they have no reason to censor anything.

    They are protected from lawsuits based on content provided by others, and that is not the reason they are censoring it. Their reason for censoring that content is that they find it offensive — the same reason the LI moderators censor content they find offensive. That is the express purpose of Section 230.

    Or they censor content they do not like and so become editors and publishers responsible for the content on their sites. The act of censoring content makes one legally liable for what is not censored.

    That is not true. § 230 explicitly says the opposite. That was the whole point of § 230 in the first place. The horrible Prodigy case in the NY Supreme Court said that by removing some user-supplied content Prodigy became the publisher of whatever it didn’t remove. And that would have utterly destroyed the possibility of forums like the one we’re using right now. So in order to prevent that Congress passed § 230, explicitly allowing interactive computer services to delete offensive content without thereby becoming publishers.

    Make no mistake, if § 230 is repealed then Prof J will have no choice but to close down this forum. He can’t leave it unmoderated or it will become a sewer, but he can’t possibly catch every offensive comment the moment it’s posted. Even pre-moderation wouldn’t be enough, because it would require the moderators to carefully read each comment before approving it and to play it safe by deleting anything even slightly questionable. Anything potentially defamatory of non-public figures would have to go, because the moderators can’t possibly be expected to investigate the truth of such material.

      Vladtheimp in reply to Milhouse. | June 10, 2020 at 8:03 pm

      Sorry, Section 230 states:

      “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
      (2) Civil liability No provider or user of an interactive computer service shall be held liable on account of—
      (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
      (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).”

      Restricting “access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” is completely different from adding content that characterizes material as allegedly false, misleading, et al.

      Professor Turley is right – These forums can ban comments, but not give their political, religious, social opinions on the content – at that time they are adding content and not an ’empty vessel.’

      I’m sure Professor Jacobson knows he is in no jeopardy.

        Milhouse in reply to Vladtheimp. | June 11, 2020 at 12:30 am

        Professor Turley is right – These forums can ban comments, but not give their political, religious, social opinions on the content – at that time they are adding content and not an ’empty vessel.’

        1. We are not talking about a provider adding content. That is not the subject of this post, or of Turley’s article. Gofundme did not add one word to Owens’s fundraiser; on the contrary, it canceled it.

        2. When they do add content, of course they are responsible for that content. Nobody has ever suggested they aren’t. But § 230 makes them not responsible for content they did not provide and may not even have seen. And it does so even if they remove user content that they have seen and find objectionable.

Connivin Caniff | June 10, 2020 at 1:23 pm

So what if they lose their “immunity”? Can’t the website just say that the opinions expressed in the posts are those of the posters alone and not necessarily those of the website? And that the website reserves the right to reject all posts, for whatever reason the website may have?

    Milhouse in reply to Connivin Caniff. | June 10, 2020 at 2:25 pm

    No. If § 230 is repealed then under Prodigy the moment a site deleted one user-supplied comment it would become the publisher of whatever it doesn’t delete, even if it never saw it.

    Prodigy is only a NY state case, so it’s not binding anywhere else, but nobody would take the chance that their own state’s courts, or US courts, might agree with it. Only a Supreme Court decision overturning it would make providers feel safe again. I think such a decision is likely, but it could take years, during which time the whole universe of user-provided content would be destroyed.

TheOldZombie | June 10, 2020 at 1:33 pm

The problem with 230 is that no one in the government is enforcing it in the judicial branch. They either don’t understand it or they are actually on the side of the companies because of ideology.

    CorkyAgain in reply to TheOldZombie. | June 10, 2020 at 1:59 pm

    What is there to enforce? What crime is being committed, i.e., what statute is being violated?

    I’m not a lawyer, but if I understand correctly, the problem is that the courts have interpreted 230 as granting immunity to these companies. So unless and until that section is repealed or appropriately revised by the legislature, no lawsuit against them will be allowed to continue.

    It’s Congress that needs to act on this, not the DOJ.

You reap what you sow.

So far they’re not reaping anything—they’re getting away with it. All of it. Every bit of underhanded bigotry and perverted chicanery. And until somebody gets hammered for it, and good, they’ll continue to get away with it.

broomhandle | June 10, 2020 at 1:54 pm

National Review has a good article on Section 230. Worth reading to understand this better.

The immunity under Section 230 of the Communications Decency Act is based on the theory that these sites are not responsible for content.

That’s not a theory, it’s the section’s explicit purpose. The whole point was to overturn the Prodigy case. Maybe you weren’t around then and don’t remember what a shock it sent through the whole internet.

Courts have interpreted the provision to give sweeping immunity for companies like Twitter and Facebook because they simply supply a forum for others to express themselves.

That’s not the courts’ interpretation, that’s the law. It’s what Congress said and meant.

And there was never any requirement, explicit or implicit, of political neutrality. Those making such a claim are not telling the truth. While it was being lobbied for and passed, nobody ever said anything about such a requirement. The law explicitly allows providers to delete offensive content, and who could possibly decide what is offensive except the provider himself?

    daniel_ream in reply to Milhouse. | June 10, 2020 at 2:50 pm

    Maybe you weren’t around then and don’t remember what a shock it sent through the whole internet.

    I was.

    It’s gotten to the point where I have the text of S.230(c) bound to a keyboard macro for cutting and pasting into forum posts. It saves typing.

    […] political neutrality. Those making such a claim are not telling the truth.

    The funny thing is, the intent of S.230 was to allow sites to delete porn.

      I go cross eyed reading legal text.

      American philosophical theory presupposes that when two opposing parties clash, both behave according to a mutually agreed-upon social compact; that both operate as honest and just actors.

      What happens when the social compact is voided?

      In pursuing a dry, letter-of-the-law interpretation, Milhouse unwittingly disregards our social compact.

        Milhouse in reply to Tiki. | June 10, 2020 at 5:26 pm

        What the hell is that supposed to mean? Do you even know? THERE IS NOT AND HAS NEVER BEEN ANY “SOCIAL COMPACT” to ignore what the law actually says and instead just make stuff up because of your feelingz. And if you think there is you don’t belong on this blog.

          Tiki in reply to Milhouse. | June 10, 2020 at 8:22 pm

          So, just to be certain, you’re not a fan of the social compact thingy?

          We’ll go to the ovens, but damnit, we did it by the book!

          Milhouse in reply to Milhouse. | June 11, 2020 at 12:31 am

          I’m saying there is no such compact and never has been. And that making up laws based on the judge’s whims and feelingz is precisely what this blog is against!

        daniel_ream in reply to Tiki. | June 11, 2020 at 12:45 am

        Dude, it’s literally one sentence:

        (1) Treatment of publisher or speaker

        No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

        That’s it. It’s about as clear and unambiguous as “…shall not be infringed.”

        For the love of St. Michael the Archangel, learn how the Internet works. Four web sites are not “the Internet”.

This latest is another example of supposedly well-intentioned regulations being abused by malicious actors to disenfranchise an individual or minority groups of people.

Checkmated by default.

    Milhouse in reply to Tiki. | June 10, 2020 at 5:28 pm

    The regulation is not being abused. Gofundme is using it exactly as it was intended. It doesn’t read all user-supplied content, but when offensive content comes to its attention it deletes it. That is what the law is for. Who is to decide what is offensive, if not the site owner?

There is a GoFundMe account titled F*** GoFundMe, calling out the discrimination against Candace Owens. Posted by Jim Kelley.

Google “alternatives to gofundme”.

    n.n in reply to gibbie. | June 10, 2020 at 3:45 pm

    Exactly. Let the market (“democracy”) judge the warlock judges (e.g. protestors) and their backers (e.g. GoFundMe).

“Could”. Enough of “could”.

AG Barr muttered something about this, then went back to bed.