Trump’s Facebook Ban Is Upheld—For Now
The Facebook Oversight Board is often described as a “Supreme Court” for Facebook. On Wednesday, it acted like one, issuing a fine-grained ruling that punts the hardest question posed to it back down for Mark Zuckerberg to deal with.
The issue before the board, in case you haven’t turned on the news or checked Twitter this week, was whether to uphold Facebook’s indefinite ban of Donald Trump’s account following his role in inciting the January 6 riot at the Capitol. It was, by far, the most hotly anticipated decision in the Oversight Board’s young existence. Since the company referred the case to the board on January 21, the board has received more than 9,000 public comments on the matter. As of Wednesday, the Trump ban remains in place—but the decision still isn’t final.
Specifically, Facebook asked the Oversight Board to decide:
Considering Facebook’s values, specifically its commitment to voice and safety, did it correctly decide on January 7, 2021, to prohibit Donald J. Trump’s access to posting content on Facebook and Instagram for an indefinite amount of time?
The board’s answer was yes—and no. Yes, Facebook was right to suspend Trump’s account; no, it was wrong to do so indefinitely. “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote in its decision. “The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.” In other words, Facebook must decide whether to reinstate Trump immediately, set a clear end date on his suspension, or kick him off its platforms forever.
While the board took Facebook to task for refusing to take a clearer stand, it also endorsed the immediate logic of the takedown. The original decision to deactivate Trump’s account was made under extraordinary circumstances. With the violent attack on the US Capitol still raging, Trump made a series of posts, including a video, in which he told his followers to go home—but in which he also repeated the false claim that the election had been stolen, the very idea motivating his rioting supporters. “This was a fraudulent election, but we can’t play into the hands of these people,” he said in the video. “We have to have peace. So go home. We love you. You’re very special.” By the next day, Facebook had taken the posts down and suspended Trump entirely from both Facebook and Instagram. (Twitter and YouTube did likewise.)
It was clear all along that the content of the offending posts was far from Trump’s most egregious—after all, he was at least telling the rioters to go home—and didn’t obviously violate any clear rule. Trump had been using Facebook to broadcast the stolen-election myth for months. What had changed was not Trump’s online behavior, then, but the offline consequences of it. In a blog post explaining Facebook’s decision, Mark Zuckerberg tacitly recognized as much. “We removed these statements yesterday because we judged that their effect—and likely their intent—would be to provoke further violence,” he wrote. While the platform previously tolerated Trump, “the current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government.” Trump would remain banned “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”
The decision was a striking departure from Facebook’s normal approach to moderation in two ways. First, the company explicitly looked not just at the content of the posts but at the real-world context. Second, it departed from its “newsworthiness” policy, which generally gives political leaders extra leeway to break the rules on the theory that the public deserves to know what its leaders have to say.