Facebook CEO Mark Zuckerberg is seen testifying remotely during a Senate Judiciary Committee hearing that is looking into how Facebook and Twitter moderated content during the U.S. presidential election in November. File Pool Photo by Hannah McKay/UPI
Oct. 21 (UPI) -- Facebook's oversight board found in reports published Thursday that the platform lacks transparency about a program shielding high-profile users from content moderation rules.
The finding came from the independent board's first quarterly transparency reports, which cover the fourth quarter of 2020 and the first two quarters of 2021. It follows the Wall Street Journal's report last month that Facebook's cross-check program shields millions of celebrities, politicians and other high-profile users from the rules.
"Today's reports conclude that Facebook has not been fully forthcoming with the board on its 'cross-check' system, which the company uses to review content decisions related to high-profile users," the board said in a statement Thursday. "The board has also announced that it has accepted a request from Facebook, in the form of a policy advisory opinion, to review its cross-check system and make recommendations on how it can be changed.
"As part of this review, Facebook has agreed to share with the board documents concerning the cross-check system as reported in the Wall Street Journal," the statement added.
The review will also engage academic experts and researchers and call for public comments, according to the transparency reports.
The board, which is made up of experts in human rights, freedom of expression and the law, said Facebook at times failed to provide relevant information on cross-check, and at other times provided information that was incomplete.
In particular, the board criticized Facebook for failing to mention cross-check upfront when it referred a case related to former President Donald Trump to the board.
"This omission is not acceptable," the transparency reports said. "Facebook only mentioned cross-check to the board when we asked whether Mr. Trump's page or account had been subject to ordinary content moderation processes."
The board said the Trump case was the board's first decision related to cross-check.
Facebook banned Trump indefinitely after the Jan. 6 Capitol attack by his supporters.
The ban also includes Instagram, which Facebook owns.
In May, the board upheld the tech giant's decision to restrict Trump from posting on Facebook and Instagram, but also raised concern about the "indefinite suspension" penalty.
"Facebook's normal penalties include removing the violating content, imposing a time-bound period of suspension, or permanently disabling the page and account," the board wrote then. "The board insists that Facebook review this matter to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform. Facebook must complete its review of this matter within six months."
The board also noted in its Trump decision that there is "limited public information on cross-check," and that "the lack of transparency regarding these decision-making processes appears to contribute to perceptions that the company may be unduly influenced by political or commercial considerations."
The Trump decision also raised concerns that applying special content moderation rules to high-profile users could be abused, as the Journal reported.
Facebook told the board that it applies the "cross-check" system to some "high-profile" accounts to "minimize the risk of errors in enforcement." Under the cross-check system, content found to violate the tech giant's community standards is sent for additional internal review, and then Facebook decides if the content is in violation.
Facebook spokesman Andy Stone told the Journal that criticism of cross-check was fair, but that the system "was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding."
Over half a million user appeals were submitted to the board between October 2020 and June 2021, the transparency reports show. Of the 11 cases decided by the end of June, the board overturned Facebook's decision eight times and upheld it three times. The board made 52 recommendations based on these decisions and received 9,842 public comments, the vast majority of them related to the Trump decision.