
Report: Documents Reveal Facebook Exempts a ‘Secret Elite’ From the Rules

“High-profile accounts posed greater risks than regular ones, researchers noted, yet were the least policed.”

Facebook preaches free speech. It insists it has no favorites and does not censor non-leftists.

The Wall Street Journal revealed documents showing Facebook exempts some high-profile users from the rules.

This line in the article perfectly sums up the situation: “High-profile accounts posed greater risks than regular ones, researchers noted, yet were the least policed.”

In private, the company has built a system that has exempted high-profile users from some or all of its rules, according to company documents reviewed by The Wall Street Journal.

The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted”—rendered immune from enforcement actions—while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.

At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users. In 2019, it allowed international soccer star Neymar to show nude photos of a woman, who had accused him of rape, to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook’s fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up “pedophile rings,” and that then-President Donald Trump had called all refugees seeking asylum “animals,” according to the documents.

A 2019 internal review of Facebook’s whitelisting practices, marked attorney-client privileged, found favoritism to those users to be both widespread and “not publicly defensible.”

A confidential source admitted, “We are not actually doing what we say we do publicly.”

The source described Facebook’s actions as “a breach of trust.”

“Unlike the rest of our community, these people can violate our standards without any consequences,” the source added.

Someone introduce this source to YouTube and Twitter.

Anyway, XCheck covered at least 5.8 million people in 2020. Facebook built “invisible elite tiers within the social network” because it struggled “to accurately moderate a torrent of content and avoid negative attention.”

Facebook even misled its own Oversight Board, which it “created to ensure the accountability of the company’s enforcement systems.”

WSJ said the documents show that those at Facebook know “in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”

It seems everyone in the company knows, including CEO Mark Zuckerberg:

Sometimes the company held back for fear of hurting its business. In other cases, Facebook made changes that backfired. Even Mr. Zuckerberg’s pet initiatives have been thwarted by his own systems and algorithms.

The documents include research reports, online employee discussions and drafts of presentations to senior management, including Mr. Zuckerberg. They aren’t the result of idle grumbling, but rather the formal work of teams whose job was to examine the social network and figure out how it could improve.

They offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the CEO himself. And when Facebook speaks publicly about many of these issues, to lawmakers, regulators and, in the case of XCheck, its own Oversight Board, it often provides misleading or partial answers, masking how much it knows.

Special Treatment

I have a few friends who regularly get muted or suspended from Facebook. We often wonder why or how. Now we know:

Sometimes the company’s automated systems summarily delete or bury content suspected of rule violations without a human review. At other times, material flagged by those systems or by users is assessed by content moderators employed by outside companies.

Mr. Zuckerberg estimated in 2018 that Facebook gets 10% of its content removal decisions wrong, and, depending on the enforcement action taken, users might never be told what rule they violated or be given a chance to appeal.

But if you are whitelisted, none of this applies to you. Qualifications include “newsworthy,” “influential or popular,” and “PR risky.”

WSJ brought up Brazilian soccer star Neymar da Silva Santos Jr.:

Neymar, the Brazilian soccer star whose full name is Neymar da Silva Santos Jr., easily qualified. With more than 150 million followers, Neymar’s account on Instagram, which is owned by Facebook, is one of the most popular in the world.

After a woman accused Neymar of rape in 2019, he posted Facebook and Instagram videos defending himself—and showing viewers his WhatsApp correspondence with his accuser, which included her name and nude photos of her. He accused the woman of extorting him.

Facebook’s standard procedure for handling the posting of “nonconsensual intimate imagery” is simple: Delete it. But Neymar was protected by XCheck.

For more than a day, the system blocked Facebook’s moderators from removing the video. An internal review of the incident found that 56 million Facebook and Instagram users saw what Facebook described in a separate document as “revenge porn,” exposing the woman to what an employee referred to in the review as abuse from other users.

“This included the video being reposted more than 6,000 times, bullying and harassment about her character,” the review found.

Facebook’s operational guidelines stipulate that not only should unauthorized nude photos be deleted, but that people who post them should have their accounts deleted.

“After escalating the case to leadership,” the review said, “we decided to leave Neymar’s accounts active, a departure from our usual ‘one strike’ profile disable policy.”

Neymar did not face any charges. The authorities charged the woman with slander, extortion, and fraud. They dropped the first two. She was acquitted of fraud.

What Counts?

Zuckerberg does not want Facebook “to become the arbiters of truth.”

Yet important people get special treatment.

Facebook listens and notices when you complain about it moderating simple messages. But when its human and automated systems flag someone prominent, things become a mess:

Last year, Facebook’s algorithms misinterpreted a years-old post from Hosam El Sokkari, an independent journalist who once headed the BBC’s Arabic News service, according to a September 2020 “incident review” by the company.

In the post, he condemned Osama bin Laden, but Facebook’s algorithms misinterpreted the post as supporting the terrorist, which would have violated the platform’s rules. Human reviewers erroneously concurred with the automated decision and denied Mr. El Sokkari’s appeal.

As a result, Mr. El Sokkari’s account was blocked from broadcasting a live video shortly before a scheduled public appearance. In response, he denounced Facebook on Twitter and the company’s own platform in posts that received hundreds of thousands of views.

Facebook swiftly reversed itself, but shortly afterward mistakenly took down more of Mr. El Sokkari’s posts criticizing conservative Muslim figures.

Mr. El Sokkari responded: “Facebook Arabic support team has obviously been infiltrated by extremists,” he tweeted, an assertion that prompted more scrambling inside Facebook.

After seeking input from 41 employees, Facebook said in a report about the incident that XCheck remained too often “reactive and demand-driven.” The report concluded that XCheck should be expanded further to include prominent independent journalists such as Mr. El Sokkari, to avoid future public-relations black eyes.

Whitelisters Posting Misinformation

So how about when whitelisted people share misinformation? This is the best line in the article: “High-profile accounts posed greater risks than regular ones, researchers noted, yet were the least policed.”

It has happened:

In one instance, political whitelist users were sharing articles from alternative-medicine websites claiming that a Berkeley, Calif., doctor had revealed that chemotherapy doesn’t work 97% of the time. Fact-checking organizations have debunked the claims, noting that the science is misrepresented and that the doctor cited in the article died in 1978.

Will It End?

Facebook spokesman Andy Stone called criticism of XCheck fair, but defended the system while insisting Facebook is trying to phase it out:

In a written statement, Facebook spokesman Andy Stone said criticism of XCheck was fair, but added that the system “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”

He said Facebook has been accurate in its communications to the board and that the company is continuing to work to phase out the practice of whitelisting. “A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them,” he said.

A product manager on Facebook’s Mistakes Prevention Team admitted the company keeps adding to the VIP lists. She has a plan to “stop the bleeding,” like “blocking Facebook employees’ ability to enroll new users in XCheck.”

I also like this idea:

One potential solution remains off the table: holding high-profile users to the same standards as everyone else.

“We do not have systems built out to do that extra diligence for all integrity actions that can occur for a VIP,” her memo said. To avoid making mistakes that might anger influential users, she noted, Facebook would instruct reviewers to take a gentle approach.

“We will index to assuming good intent in our review flows and lean into ‘innocent until proven guilty,’ ” she wrote.




Omgz. No way.

XCheck pigs are more equal than others, but they’re still pigs.

Interesting. Doesn’t sound like acting in good faith to me. In fact, sounds remarkably like acting in bad faith. Would love to see this cited in a motion in President Trump’s litigation. Let’s see how their section 230(c)(2) immunity holds up in court now.

because of course they do…

Big Tech is evil

Because, of course they do.

Two sets of rules. One for the favored pure and true elites. And one for the peasants and their filthy despicable leaders.

In this Facebook is only emulating the federal government, AKA the Deep State and the Swamp. And the Dem-media complex.

Anyone think that if the 2020 election was flipped, that if Trump was certified the winner, and Biden and his supporters screamed the election was stolen, which was almost certainly their plan, their claim would be dismissed by the ruling elites — the courts, the corporate media, the federal bureaucracy?

Not to mention that’s exactly what Hillary and her supporters claimed in 2016. We saw what that led to.

In 2020 the rioting of the summer would have been only a dress rehearsal for a real Dem led post-election ‘insurrection’.

That’s where we are. The Dems have dragged the nation into a cold civil war out of desperation and fear that the gains from their long march through American institutions of the last 50 years might slip away as the people rise up in opposition.

The One pre-emptively adopted the strategy of labelling his political opposition ‘white supremacists’, racists, and bigots ‘on the wrong side of history’. The Dems and their propaganda media allies have been running with it ever since.

“Facebook’s problems are known inside the company, up to the CEO himself. And when Facebook speaks publicly about many of these issues, to lawmakers, regulators and, in the case of XCheck, its own Oversight Board, it often provides misleading or partial answers, masking how much it knows.”

Wow. The more things change, eh?

On October 18, 1995, Thomas A. Busey, then Chief of the National Firearms Act Branch of the Bureau of Alcohol, Tobacco and Firearms (hereafter “BATF”) made a videotaped training presentation to BATF Headquarters personnel…
Busey began his roll call presentation by acknowledging that “Our first and main responsibility is to make accurate entries and to maintain accuracy of the NFRTR [federal registry of legal civilian owners of machine guns and other destructive devices]…” Moments later Busey makes the astonishing statement that:

… when we testify in court, we testify that the data base is 100 percent accurate. That’s what we testify to, and we will always testify to that. As you probably well know, that may not be 100 percent true.

Busey then goes on for several minutes describing the types of errors which creep into the NFR&TR and then repeats his damning admission:

So the information on the 728,000 weapons that are in the data base has to be 100 percent accurate. Like I told you before, we testify in court and, of course, our certifications testify to that, too, when we’re not physically there to testify, that we are 100 percent accurate.

How bad was the error rate in the NFR&TR? Busey again:

… when I first came in a year ago, our error rate was between 49 and 50 percent, so you can imagine what the accuracy of the NFRTR could be, if your error rate’s 49 to 50 percent.


But hey, it’s just 10 years minimum for the poor sap whose registration they lose. Close enough for government work. Or now, Fakebook.

So what do the lawyers on this site say about how this squares (or not) with 230 and its authorities?

    daniel_ream in reply to rduke007. | September 13, 2021 at 10:48 pm

    Please stop obsessing over S.230. You don’t have a right to use someone else’s computer for free, and immunity from liability for removal of user-submitted content is the only thing keeping the Internet free and functional. If you want to argue that the government should have the right to tell random web sites what they can and can’t allow to be posted, I invite you to move to Canada where the inevitable endgame of that stupid position is now playing out.

      There is a very powerful rejoinder to your argument. The Constitution protects the right of freedom of association. But if somebody runs a place of public accommodation, for example a restaurant open to all, state laws provide that the person cannot make arbitrary decisions to exclude some people rather than others. The good faith requirement in section 230 is a similar limit on the ability of big tech to decide who does or does not have access: they can use objective & neutral criteria but cannot base the decisions on partisan bias. Draw your own conclusions about what they’re doing.

        henrybowman in reply to RRRR. | September 14, 2021 at 5:40 pm

        Of course, without the “interpretation” of black-robed government agents with an agenda, there is no more constitutional justification for “you are a place of public accommodation” than there is for “we did it for the children.” It was an earlier penumbra and emanation, and not the first, either. Your constitutional rights don’t disappear just because you make market exchanges with someone else.

      DaveGinOly in reply to daniel_ream. | September 14, 2021 at 11:16 pm

      Check out Marsh v Alabama. The CliffsNotes version is that a woman who lived in a literal company (owned) town was prevented by the company from handing out religious literature. She sued. Successfully. SCOTUS said the more a company extends its private property to use by the public (in this case, its employees who were literally living on private property), the fewer restrictions it is able to enforce over the public’s exercise of its rights.

      Facebook’s platform is private, but it was built to be a platform used by the public, each to be able to express their thoughts and to communicate with others. According to Marsh, this means Facebook has surrendered its control over the rights of the public.

      From Wikipedia’s page concerning Marsh:
      The state had attempted to analogize the town’s rights to the rights of homeowners to regulate the conduct of guests in their home. The Court rejected that contention by noting that ownership “does not always mean absolute dominion”. The court pointed out that the more an owner opens his property up to the public in general, the more his rights are circumscribed by the statutory and constitutional rights of those who are invited in.

Zuckerberg has testified to congress several times. This makes it sound like he has perjured himself.

Will he be referred for prosecution? Probably not.

If referred, will Merrick Garland do anything about it? I assume Merrick Garland is one of the “elites.” Anyway, he’s a lousy AG so I don’t expect anything.

    theduchessofkitty in reply to irv. | September 14, 2021 at 2:34 am

    Forget the DOJ or the courts.

    Better to use the nugget of wisdom Cornelius Vanderbilt threw to two men who scammed him out of a heavy investment: “I’m not going to sue you. The courts take too long. I’m going to RUIN YOU!”

    He did.

theduchessofkitty | September 14, 2021 at 2:31 am

“All animals are equal, but some animals are more equal than others.” – George Orwell, Animal Farm

Never thought FB would have this above quote as a basis for company policy, but here we are.

Not shocked at all, in fact would be surprised if the Marxists didn’t do that.

Do I understand this correctly? There is written testimony and documental evidence that Mr. Zuckerberg lied to Congress on a multitude of occasions? Will he be subject to a FBI SWAT raid to confiscate all his electronic devices at home and in his offices to secure the evidence? And be tried as if he were, oh, say, a Mr. Stone or a normal American citizen not favored by the Asterisk regime?

    henrybowman in reply to felixrigidus. | September 14, 2021 at 5:42 pm

    Grassley and Issa investigated Fast and Furious for over two years.
    Eric Holder was called in front of Congress to answer for it. He refused.
    He was cited for contempt of Congress. And then absolutely nothing.
    Expect the same for any other Democrat.

IMO, what this reveals is the sheer impossibility of FB monitoring and policing uploaded content in anything close to real time. Algorithms are OK for initial triage but are limited. Boolean logic isn’t adaptable; the programs do what their human creators instruct, in a binary way, without any nuance.

Sec 230 was written precisely because the tech social media companies were and remain unable to police content. The problem most people have is that these companies have been putting their thumb on the scale to address content they disagree with.

It seems an easy fix to me: apply an equal adjudication process under transparent and equal rules for every content provider, or lose Sec 230 protection.

Who decides what ‘elite’ is? The same folks that figure out what controversial is? How can someone be part of the deciders? How about a test, like Mensa? I may qualify: as an American of Polish descent, I belong to Densa. I mean, why not?