
REPORT: Steven Crowder Claims YouTube is Silencing Tulsi Gabbard

https://www.youtube.com/watch?v=YOByUDv1ftQ

A new experiment by conservative comedian Steven Crowder alleges that YouTube isn’t merely lying, but is purposely curating search results within the United States to a degree that could be considered election tampering.

In a livestream posted Tuesday, the comedian explained his process for determining whether his own videos would appear at the top of search results. By having his researcher change the apparent country of their IP address with a VPN, they found that different countries receive different results for searches like “Change My Mind” or “Tulsi Gabbard.” In every country except the United States, Crowder’s and Gabbard’s channels appeared at the top of the results.
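
For readers who want to try a rough version of this comparison themselves, here is a minimal sketch that queries the YouTube Data API with different regionCode values instead of switching VPN locations. This is an assumption about how one might approximate the test, not Crowder’s actual method: the API key, query, and region list are placeholders, and API results will not necessarily match what the website shows a logged-out browser.

```python
# Minimal sketch: compare top YouTube search results across regions.
# Assumes a YouTube Data API v3 key; "YOUR_API_KEY" is a placeholder.
import requests

API_KEY = "YOUR_API_KEY"
QUERY = "Tulsi Gabbard"
REGIONS = ["US", "CA", "GB", "AU"]  # ISO 3166-1 alpha-2 country codes to compare


def top_results(query, region, limit=5):
    """Return (channel, title) pairs for the top results served to a region."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "q": query,
            "type": "video",
            "regionCode": region,
            "maxResults": limit,
            "key": API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [
        (item["snippet"]["channelTitle"], item["snippet"]["title"])
        for item in resp.json().get("items", [])
    ]


if __name__ == "__main__":
    for region in REGIONS:
        print(f"--- {region} ---")
        for channel, title in top_results(QUERY, region):
            print(f"{channel}: {title}")
```

If the same query returns a noticeably different channel ordering for only one region, that is the kind of discrepancy Crowder describes, though regional licensing and personalization can also shift results.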

As of 2019, YouTube is the only mainstream video-sharing platform with the reach necessary to have a major effect on the public discussion. When a platform of that size curates edgy political content, or goes so far as to limit a politician currently running for president, it sets a disturbing precedent that puts the platform in control of the public discussion.

The livestream sparked a hashtag on Twitter, #CrowderExposesYouTube, where prominent voices from across the political spectrum called out YouTube for what Crowder alleges is politically motivated content curation. At the time of writing, neither the Gabbard campaign nor YouTube has released a statement addressing Crowder’s claim.

Rumors of shadow banning and curated search results have circulated in conservative circles on YouTube for years. Prominent conservatives and libertarians like Lauren Chen and Matt Christiansen have pointed to analytics suggesting that their channel growth has been slowed over time.

Dennis Prager’s channel, PragerU, has even been in an on-again, off-again lawsuit with YouTube for several years, seeking redress for the age-restriction of its educational videos.

Allegations like these have been made by numerous people on both the right and the left who claim that their content is slowly being throttled, removed from suggested videos, and buried in search results to stifle the growth of channels YouTube doesn’t like. The claims even extend to LGBTQ+ and far-left channels, where videos discussing sexuality and bigotry are reportedly age-restricted or flagged.

Up until now, YouTube has managed to keep all such accusations from getting out of control by blaming low-level programmers and the site’s algorithms for any such inequities.


Comments

Even if you saw the algorithms they probably would not make much sense, in that it’s machine learning and that disappears into eigenvalues and eigenvectors and worse.

You can adopt or disadopt a particular learning instance and get control that way, but the algorithm itself does not explicitly show what sort of results it favors.

It’s about equally likely that it’s confirmation bias and biased editing. The history of the code adoption might show something, but the code won’t.

    notamemberofanyorganizedpolicital in reply to rhhardin. | October 26, 2019 at 1:01 pm

    Yes.

    Algorithm formulas have to be written by people (and most of those tech people writing them are Democrats and Leftists).

    So there is no doubt their personal preferences are coming into play, if not into PAY…….

I rather suspect that it’s neural networks of the wet-ware variety.

No, not even close. Doing identical searches to the same platform using the same search parameters and the only difference is the source IP address yielding very different results? Sorry, I’m finding it very hard to swallow that it’s all a function of machine learning.

If the searches were similar but different I could see confirmation bias. If they are the same, they should yield the same results. They don’t.

I also don’t buy that some low level programmer made some code that nobody bothered to check that results in skewed search returns. If they did, YouTube would have fixed the issue as soon as it was brought to their attention. They haven’t, which leads me to believe it’s intentional.

Add to that that there is an ability for a human to restore or delete content from the platform and that they’ve been playing games with Steven Crowder’s content without being able to explain why they’re jerking him around? No, totally intentional.

    assemblerhead in reply to vinnymeyer. | October 25, 2019 at 8:28 pm

    It’s only a matter of time till they start editing / sabotaging anything uploaded by anyone they don’t like.

    They need to be stripped of Section 230 protection.

Machine learning that “Orange man bad!”

Of course they are

YouTube has managed to keep all such accusations from getting out of control by blaming low level programmers

I don’t recall “it’s not our fault, our employees did it” being a defense in modern law. Your employees are presumed to have acted under your direction, especially when what they did is a spot-on match with their job description.

There was another round of account terminations last week. I’m surprised Lauren, Matt, and Stefan Molyneux haven’t been banned yet. I’m also surprised Brittany Sellner hasn’t had her account terminated for association with her husband, since his English-language account is gone. Lauren Southern quit before she could be banned, I assume she’d be insta-banned if she came back.

    notamemberofanyorganizedpolicital in reply to randian. | October 26, 2019 at 1:03 pm

    Remember when YouTube pulled the Legal Insurrection YouTube Channel?

    (Someone correct me on that if I’m wrong.)