
Admit It: The Facebook Oversight Board Is Kind of Working

Judging from the press releases filling my inbox and the tweets lighting up my timeline, no one is happy with Facebook right now. On Friday, the company issued its response to the Facebook Oversight Board’s recommendations on the indefinite ban of Donald Trump. We learned that Trump’s account is now frozen for precisely two years from his original January 7 suspension date, at which point Facebook will reassess the risks of letting him back on. The response also includes a number of other policy changes. Opinions on the announcement range from calling it a pointless bit of “accountability theater” to suggesting that it’s cowardly and irresponsible. Republicans are, of course, outraged that Trump hasn’t been reinstated.

I confess to finding myself in a different camp. The Oversight Board is performing a valuable, though very limited, function, and the Trump situation illustrates why.

When the board first published its ruling last month, it issued both a binding command—Facebook must articulate a specific action on Donald Trump’s account and could not continue an indefinite suspension—and nonbinding recommendations, most notably that the platform abandon its policy of treating statements by politicians as inherently “newsworthy” and thus exempt from the rules that apply to everyone else. As I wrote at the time, Facebook’s response to the nonbinding part would probably prove more important. It would apply more broadly than to just Trump’s account, and it would show whether the company is willing to follow the Oversight Board’s advice even when it doesn’t have to.

Now we know that the answer to that last question is yes. In its announcement on Friday, Facebook says it is committed to fully following 15 of the 19 nonbinding recommendations. Of the remaining four, it is rejecting one, partially following another, and doing more research on two.

The most interesting commitments are around the “newsworthiness allowance.” Facebook says it will keep the exception in place, meaning it will still allow some content that violates its Community Standards to stay up if it is “newsworthy or important to the public interest.” The difference is that the platform will no longer treat posts by politicians as more inherently newsworthy than posts by anyone else. It is also increasing transparency by creating a page explaining the rule; beginning next year, it says it will publish an explanation each time the exception is applied to content that otherwise would have been taken down.

Let this sink in for a moment: Facebook took detailed feedback from a group of thoughtful critics, and Mark Zuckerberg signed off on a concrete policy change, plus some increased transparency. This is progress!

Now, please don’t confuse this for a complete endorsement. There is plenty to criticize about Facebook’s announcement. On the Trump ban, while the company has now articulated more detailed policies around “heightened penalties for public figures during times of civil unrest and ongoing violence,” the two-year maximum suspension it settled on seems suspiciously tailored to allow Trump back on the platform just as he gears up to run for president again. And Facebook’s new commitments to transparency leave much to be desired. Its new explanation of the newsworthiness allowance, for example, provides zero information about how Facebook defines “newsworthy” in the first place, which is a pretty important detail. Perhaps the case-by-case explanations beginning next year will shed more light, but until then the policy is about as transparent as a fogged-over bathroom window.

Indeed, as with any announcement from Facebook, this one will be impossible to evaluate fully until we see how the company follows through in practice. In several cases, Facebook claims that it’s already following the Oversight Board’s recommendations. Some of those claims strain credulity. For instance, in response to a suggestion that it rely on regional linguistic and political expertise in enforcing policies around the world, the company declares, “We ensure that content reviewers are supported by teams with regional and linguistic expertise, including the context in which the speech is presented.” And yet a Reuters investigation published this week found that posts promoting gay conversion therapy, which Facebook’s rules prohibit, continue to run rampant in Arab countries, “where practitioners post to millions of followers through verified accounts.” As the content moderation scholar Evelyn Douek puts it, with many of its statements “Facebook gives itself a gold star, but they’re really borderline passes at best.”

