Facebook parent Meta’s quasi-independent oversight board said Tuesday that an internal system that exempted high-profile users, including former U.S. President Donald Trump, from some or all of its content moderation rules needs a major overhaul.
The report by the Oversight Board, which was more than a year in the making, said the system “is flawed in key areas which the company must address.”
The board opened its review after The Wall Street Journal reported last year that the system was being abused by many elite users, who posted material that would draw penalties for ordinary people, including harassment and incitement of violence.
Facebook’s rules reportedly didn’t seem to apply to some VIP users at all, while others’ rule-breaking posts were flagged for reviews that never took place, according to the Journal article, which said the system covered at least 5.8 million exempted users as of 2020.
The Oversight Board’s report said that the system — known as “XCheck,” or cross-check — resulted in users being treated unequally and led to delays in taking down content that violated the rules. Decisions took an average of five days, it found.
Among its 32 recommendations, the board said Meta “should prioritize expression that is important for human rights, including expression which is of special public importance.”
Users who are “likely to produce this kind of expression” should be given higher priority than others who are on the cross-check list because they are business partners, the report said.
Ending VIP “special protection”
“If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection,” the board said.
Addressing other flaws, the board also urged Meta to remove or hide content while it’s being reviewed and said the company should “radically increase transparency around cross-check and how it operates,” such as outlining “clear, public criteria” on who gets to be on the list.
Nick Clegg, Meta’s global vice president for public affairs, tweeted that the company requested the review “so that we can continue our work to improve the program.”
To fully address the board’s recommendations, “we’ve agreed to respond within 90 days,” he added.
The board upheld Facebook’s decision to suspend Trump last year out of concern he incited violence leading to the riot at the U.S. Capitol. But it said the company failed to mention the cross-check system in its request for a ruling.