It took a riotous mob storming the Capitol building for Facebook to finally take significant action against the president.
After four years of race-baiting, lies and hatred spewed from President Trump’s social media accounts, Facebook took the commendable step on Thursday of blocking his account indefinitely. His access won’t be reinstated until at least after Joe Biden is inaugurated, the company said.
“The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” wrote Facebook’s founder and C.E.O., Mark Zuckerberg. “We believe the risks of allowing the President to continue to use our service during this period are simply too great.”
Even as violent Trump supporters overran the Capitol building on Wednesday, the social media companies’ initial instinct was to leave the president’s posts up — posts in which Mr. Trump expressed sympathy with members of the Capitol mob and continued to lie about the outcome of the November election. Millions of people read and shared them.
It’s time for a fundamental rewiring of those instincts. Twitter, shamefully, has yet to take harsher action against Mr. Trump than a 12-hour suspension. It, too, should block Mr. Trump’s account indefinitely. What Mr. Trump has exposed is the companies’ willingness to look the other way so long as it leads to more clicks, more time spent on their platforms, more shares. He won’t be the last to exploit those tendencies.
Let’s not forget that Facebook let stand Mr. Trump’s apparent threat to protesters after the killing of George Floyd: “when the looting starts, the shooting starts.” Mr. Trump has used the platforms to downplay the risk of the coronavirus, advance conspiracy theories about his enemies, threaten world leaders and undermine the results of the election. Letting posts like that stay up on the social media platforms sets a dangerous precedent for future politicians and others who would seek to stir up the masses.
Average users who repeatedly violate the companies’ policies, particularly by inciting violence, are blocked or deleted immediately. Yet the president was given a pass, again and again.
Unfortunately, the companies are likely to continue to allow posts to stay up from other leaders spewing rancor and misinformation when such posts are deemed to have news value. That means personalities who have a large audience of people who look up to and believe them have a lower bar to clear than the general public for spouting conspiracy theories and lies. That’s the opposite of how things should be.
“We believe that the public has a right to the broadest possible access to political speech, even controversial speech,” Mr. Zuckerberg wrote on Thursday. But that’s a red herring. Facebook’s lawyers know that it is well within the company’s rights to remove or block content. The president, of all people, has many other ways to reach the electorate.
Social media sites are where people go to find compatriots and plan attacks like the thwarted attempt last year to kidnap Michigan’s governor. YouTube’s algorithm has radicalized countless young people. And this week’s rioters used right-leaning social media sites like Parler to pass around directions to the Capitol that would help them avoid police detection.
So we have to ask Facebook and Twitter: Is this who you really want to be? A safe space for riot-inciting, lying public officials, who nonetheless bring in lots of online engagement?
Mr. Zuckerberg said on Wednesday in an internal note that the company would increase its moderation of the president’s account because of the “emergency” of the day’s mob violence. That’s a telling admission that Facebook hadn’t already been devoting enough resources to moderating Mr. Trump’s dangerous account.
Jan. 6, 2021, ought to be social media’s day of reckoning. There is a greater calling than profits, and Mr. Zuckerberg and Twitter’s C.E.O., Jack Dorsey, must play a fundamental role in restoring truth and decency to our democracy and democracies around the world.
That can involve more direct, human moderation of high-profile accounts; more prominent warning labels; software that can delay posts so that they can be reviewed before going out to the masses, especially during moments of high tension; and a far greater willingness to suspend or even completely block dangerous accounts like Mr. Trump’s.
“The companies can exclude whatever content they wish,” said Richard Hasen, a professor of law and political science at the University of California, Irvine, who studies online misinformation. “There is nothing in the law that says they have an obligation to give him a loudspeaker with no mediation.”
Their obligations shouldn’t stop at the president. Accounts like that of the former Trump adviser Steve Bannon, who called in a video posted to Facebook for the beheading of Dr. Anthony Fauci and F.B.I. Director Christopher Wray, deserve similar treatment. (Mr. Zuckerberg said at the time that Mr. Bannon hadn’t violated the rules enough times to be banned.)
There are risks, of course, to policing the social media sites more aggressively. Shareholders may balk at the prospect of a ding to ad sales, even if these corporations would most likely remain among the most profitable in the world. Lawmakers may retaliate against companies that block users. They’ve hauled the companies’ executives before Congress repeatedly and threatened to subject the platforms to more stringent rules, particularly by fully revoking the legal shield that allows them to host most of the content users generate.
When Joe Biden takes office, he should be held to the same standard as regular Joes. If he starts tweeting or posting lies or inciting violence, the companies can and should quickly suspend and remove him.
Facebook deserves credit for making this decision, but America cannot risk a repeat of the events in the Capitol. If the companies once again wait until violence breaks out to act, it will be too late.