Report: Documents Reveal Facebook Exempts a ‘Secret Elite’ From the Rules

Facebook preaches free speech. It insists it plays no favorites and does not censor non-leftists.

The Wall Street Journal revealed documents showing Facebook exempts some high-profile users from the rules.

This line in the article perfectly sums up the situation: “High-profile accounts posed greater risks than regular ones, researchers noted, yet were the least policed.”

In private, the company has built a system that has exempted high-profile users from some or all of its rules, according to company documents reviewed by The Wall Street Journal.

The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted”—rendered immune from enforcement actions—while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.

At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users. In 2019, it allowed international soccer star Neymar to show nude photos of a woman, who had accused him of rape, to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook’s fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up “pedophile rings,” and that then-President Donald Trump had called all refugees seeking asylum “animals,” according to the documents.

A 2019 internal review of Facebook’s whitelisting practices, marked attorney-client privileged, found favoritism to those users to be both widespread and “not publicly defensible.”

A confidential source admitted, “We are not actually doing what we say we do publicly.”

The source described Facebook’s actions as “a breach of trust.”

“Unlike the rest of our community, these people can violate our standards without any consequences,” the source added.

Someone introduce this source to YouTube and Twitter.

Anyway, XCheck covered at least 5.8 million users in 2020. Facebook built “invisible elite tiers within the social network” because it struggled “to accurately moderate a torrent of content and avoid negative attention.”

Facebook even misled its own Oversight Board, which it “created to ensure the accountability of the company’s enforcement systems.”

WSJ said the documents show Facebook knows, “in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”

It seems everyone in the company knows, including CEO Mark Zuckerberg:

Sometimes the company held back for fear of hurting its business. In other cases, Facebook made changes that backfired. Even Mr. Zuckerberg’s pet initiatives have been thwarted by his own systems and algorithms.

The documents include research reports, online employee discussions and drafts of presentations to senior management, including Mr. Zuckerberg. They aren’t the result of idle grumbling, but rather the formal work of teams whose job was to examine the social network and figure out how it could improve.

They offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the CEO himself. And when Facebook speaks publicly about many of these issues, to lawmakers, regulators and, in the case of XCheck, its own Oversight Board, it often provides misleading or partial answers, masking how much it knows.

Special Treatment

I have a few friends who regularly get muted or suspended from Facebook. We often wonder why or how. Now we know:

Sometimes the company’s automated systems summarily delete or bury content suspected of rule violations without a human review. At other times, material flagged by those systems or by users is assessed by content moderators employed by outside companies.

Mr. Zuckerberg estimated in 2018 that Facebook gets 10% of its content removal decisions wrong, and, depending on the enforcement action taken, users might never be told what rule they violated or be given a chance to appeal.

But if you are whitelisted, none of this applies to you. Qualifications include being “newsworthy,” “influential or popular,” and “PR risky.”

WSJ brought up Brazilian soccer star Neymar da Silva Santos Jr.:

Neymar, the Brazilian soccer star whose full name is Neymar da Silva Santos Jr., easily qualified. With more than 150 million followers, Neymar’s account on Instagram, which is owned by Facebook, is one of the most popular in the world.

After a woman accused Neymar of rape in 2019, he posted Facebook and Instagram videos defending himself—and showing viewers his WhatsApp correspondence with his accuser, which included her name and nude photos of her. He accused the woman of extorting him.

Facebook’s standard procedure for handling the posting of “nonconsensual intimate imagery” is simple: Delete it. But Neymar was protected by XCheck.

For more than a day, the system blocked Facebook’s moderators from removing the video. An internal review of the incident found that 56 million Facebook and Instagram users saw what Facebook described in a separate document as “revenge porn,” exposing the woman to what an employee referred to in the review as abuse from other users.

“This included the video being reposted more than 6,000 times, bullying and harassment about her character,” the review found.

Facebook’s operational guidelines stipulate that not only should unauthorized nude photos be deleted, but that people who post them should have their accounts deleted.

“After escalating the case to leadership,” the review said, “we decided to leave Neymar’s accounts active, a departure from our usual ‘one strike’ profile disable policy.”

Neymar never faced any charges. The authorities instead charged the woman with slander, extortion, and fraud; they dropped the first two charges, and she was acquitted of fraud.

What Counts?

Zuckerberg does not want Facebook “to become the arbiters of truth.”

And yet important people get special treatment.

Facebook listens and notices when you complain about it moderating simple posts. But when its human and automated systems flag someone prominent, things become a mess:

Last year, Facebook’s algorithms misinterpreted a years-old post from Hosam El Sokkari, an independent journalist who once headed the BBC’s Arabic News service, according to a September 2020 “incident review” by the company.

In the post, he condemned Osama bin Laden, but Facebook’s algorithms misinterpreted the post as supporting the terrorist, which would have violated the platform’s rules. Human reviewers erroneously concurred with the automated decision and denied Mr. El Sokkari’s appeal.

As a result, Mr. El Sokkari’s account was blocked from broadcasting a live video shortly before a scheduled public appearance. In response, he denounced Facebook on Twitter and the company’s own platform in posts that received hundreds of thousands of views.

Facebook swiftly reversed itself, but shortly afterward mistakenly took down more of Mr. El Sokkari’s posts criticizing conservative Muslim figures.

Mr. El Sokkari responded: “Facebook Arabic support team has obviously been infiltrated by extremists,” he tweeted, an assertion that prompted more scrambling inside Facebook.

After seeking input from 41 employees, Facebook said in a report about the incident that XCheck remained too often “reactive and demand-driven.” The report concluded that XCheck should be expanded further to include prominent independent journalists such as Mr. El Sokkari, to avoid future public-relations black eyes.

Whitelisters Posting Misinformation

So what happens when whitelisted users share misinformation? Recall the line quoted at the top: high-profile accounts posed greater risks than regular ones, yet were the least policed.

It has happened:

In one instance, political whitelist users were sharing articles from alternative-medicine websites claiming that a Berkeley, Calif., doctor had revealed that chemotherapy doesn’t work 97% of the time. Fact-checking organizations have debunked the claims, noting that the science is misrepresented and that the doctor cited in the article died in 1978.

Will It End?

Facebook spokesman Andy Stone called criticism of XCheck fair, but defended the system while insisting Facebook is trying to phase out whitelisting:

In a written statement, Facebook spokesman Andy Stone said criticism of XCheck was fair, but added that the system “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”

He said Facebook has been accurate in its communications to the board and that the company is continuing to work to phase out the practice of whitelisting. “A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them,” he said.

A product manager on Facebook’s Mistakes Prevention Team admitted the company keeps adding to the VIP lists. She has a plan to “stop the bleeding,” including “blocking Facebook employees’ ability to enroll new users in XCheck.”

I also like this idea:

One potential solution remains off the table: holding high-profile users to the same standards as everyone else.

“We do not have systems built out to do that extra diligence for all integrity actions that can occur for a VIP,” her memo said. To avoid making mistakes that might anger influential users, she noted, Facebook would instruct reviewers to take a gentle approach.

“We will index to assuming good intent in our review flows and lean into ‘innocent until proven guilty,’ ” she wrote.

