Facebook’s Oversight Board Has Spoken. But It Hasn’t Solved Much
The Facebook Oversight Board issued its first five decisions Thursday. The rulings are well thought out and show that the board members, charged with reviewing Facebook’s decisions to remove content and making recommendations on its policies, take their jobs seriously. More than anything, though, they show the futility of moderating content across networks with more than 3 billion users, nearly half the people on Earth.
The cases involve posts in five languages, and often, subtleties of meaning and interpretation. Two touch on deep-seated global conflicts: China’s oppression of Uighur Muslims and the ongoing border war between Armenia and Azerbaijan. We’ve long known that the vast majority—now approaching 90 percent—of Facebook users are outside the US, but the breadth of these cases drives home the magnitude of Facebook’s challenge.
Facebook has touted automation as one solution to that challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook’s automated systems removed an Instagram post in Portuguese from a user in Brazil showing bare breasts and nipples. But the post was an effort to raise awareness of breast cancer, which is an exception to Facebook’s general ban on nudity and an issue that has bedeviled the company for a decade. To its credit, Facebook restored the post before the Oversight Board heard the case, but the episode still underscores the problems with letting algorithms do the work. In another case, involving a quote attributed to Nazi propaganda chief Joseph Goebbels, Facebook’s Memories feature had actually prompted the user to recirculate a post from two years earlier. The older post had presumably been allowed to remain, raising questions about the consistency of Facebook’s standards for reviewing content.
Facebook announced the creation of the board in 2018, after years of criticism about its role in fomenting ethnic hatred, political misinformation, and other evils. It took almost two years to assemble the 20 members, whose rulings on specific pieces of Facebook content are supposed to be binding.
In a statement Thursday, Monika Bickert, Facebook’s vice president for content policy, said the company would follow the board’s decisions to restore four items, including the Instagram post from Brazil. The board also suggested changes to Facebook’s policies, to which the company is supposed to respond within 30 days. Bickert said the recommendations “will have a lasting impact on how we structure our policies.”
In one case, though, she left some doubt. The board recommended that Facebook inform users when their content is removed by an algorithm, and allow for appeals. Bickert said the company expects to take longer than 30 days to respond to this recommendation.
Thursday’s cases may have been relatively easy ones. Coming soon: the politically fraught decision of whether to restore Donald Trump’s account, which is sure to anger a bloc of Facebook users (and employees) no matter how it is decided. Facebook punted that decision to the board last week.
Taken together, the cases decided Thursday reveal the sheer scale of Facebook’s task. Social media management company Social Report estimated in 2018 that Facebook users post 55 million status updates and 350 million photos every day; they send 9 million messages an hour and share 3 million links.
A decision on any one of those posts can be enormously complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned attempting to reach Europe in 2015, and contrasted the reaction to the photo with what the user said was a “lack of response by Muslims generally to the treatment of Uighur Muslims in China.”