More than a year after Meta asked the Oversight Board to weigh in on its cross-check rules, the group has finally published its full policy advisory on the topic. The board found that the program, which creates a separate content moderation process for certain high-profile users, prioritizes the company’s business over the rights of its users.
“In our review, we found several shortcomings in Meta’s cross-check program,” the board writes in its assessment. “While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns.” Notably, the critique echoes that of whistleblower Frances Haugen, who revealed explosive details about cross-check last year, and has said that Meta “chooses profits over safety.”
Cross-check, or xcheck, is an internal program at Facebook and Instagram that shields celebrities, politicians, and other high-profile users from the company’s automated content moderation systems. Meta has characterized it as a “second layer of review” to avoid mistakenly removing posts. But disclosures made by Haugen showed the program includes millions of accounts, and has enabled billions of views on posts that would have otherwise been taken down. The Oversight Board itself has accused Meta of being not “fully forthcoming” about the program, which was a central issue in the board’s handling of the suspension of former President Donald Trump.
The Oversight Board’s policy advisory opinion, or PAO, on the program is the most detailed look to date at Meta’s evolving cross-check rules. The board writes at length about two separate cross-check processes: Early Response Secondary Review (ERSR), which is reserved for certain high-profile users determined by Meta, and General Secondary Review (GSR), a newer system that uses an algorithm to automatically flag some types of posts from across its platform for additional review. GSR, which can apply to content from any Facebook or Instagram user, began in 2021 “in response to criticism” related to Haugen’s disclosures in the Facebook Papers.
But according to the Oversight Board, both cross-check systems have serious issues. Both operate with a “consistent backlog of cases,” which lengthens the amount of time potentially rule-breaking content is left up. “Meta told the Board that, on average, it can take more than five days to reach a decision on content from users on its cross-check lists,” the group notes. “This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm.”
The board sheds new light on one such case, pointing to a 2019 incident in which Brazilian soccer star Neymar posted a video showing nude photos of a woman who had accused him of sexual assault. Because of cross-check, the post was left up for more than a day and received more than 56 million views before it was ultimately removed. In its opinion, the board raises questions about why the athlete was not suspended, and pointedly notes that the incident only came to light as a result of Haugen’s disclosures.
“The company ultimately disclosed that the only consequence was content removal, and that the normal penalty would have been account disabling … Meta later announced it signed an economic deal with Neymar for him to ‘stream games exclusively on Facebook Gaming and share video content to his more than 166 million Instagram fans.’”
The Oversight Board is similarly critical of other “business” factors that play a role in Meta’s cross-check rules. For example, it says Meta skews toward under-enforcement of cross-checked content due to the “perception of censorship” and the effect it could have on the company. “The Board interprets this to mean that, for business reasons, addressing the ‘perception of censorship’ may take priority over other human rights responsibilities relevant for content moderation,” the group writes.
In a statement, Meta’s policy chief Nick Clegg said the company has made improvements to cross-check, including the addition of annual reviews, but did not directly address the board’s criticism that the program is designed to protect the company’s business interests. “We built the cross-check system to prevent potential over-enforcement (when we take action on content or accounts that don’t actually violate our policies) and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe,” Clegg wrote.
Unsurprisingly, the board had numerous recommendations for Meta on how to improve cross-check. It says Meta should use “specialized teams independent from political or economic influence, including from Meta’s public policy teams,” to determine which accounts get cross-check protections. It also suggests a “transparent strike system” to revoke cross-check status from accounts that abuse the company’s rules.
The board also recommends that Meta inform all accounts that are part of cross-check, and “publicly mark the pages and accounts of entities receiving list-based protection in the following categories: all state actors and political candidates, all business partners, all media actors, and all other public figures included because of the commercial benefit to the company.” It also wants Meta to track and report key statistics about cross-check accuracy, and take steps to eliminate the backlogs in cases.
In total, the Oversight Board came up with 32 detailed recommendations, which Meta will now have 90 days to respond to. As with other policy suggestions from the board, the company isn’t obligated to implement any of them, though it is expected to respond to each one.
Update 12/6 11:17 AM ET: Added Meta’s statement from Nick Clegg about cross-check.
Correction: The Neymar video received more than 56 million views before its removal, not 100 million.