Mark Zuckerberg says Facebook will ‘review’ policies on speech promoting state violence

Facebook CEO Mark Zuckerberg released a lengthy statement on his personal page on Friday saying he supports the Black Lives Matter movement and will begin engaging in a series of reviews of company policy. Specifically, Zuckerberg says he and company leadership will review its controversial stance around “threats of state use of force,” following President Donald Trump’s statement about shooting protesters that sparked outrage and various levels of response from both Facebook and Twitter.

The post largely repeated points Zuckerberg made in an all-hands meeting earlier this week:
“We’re going to review our policies allowing discussion and threats of state use of force to see if there are any amendments we should adopt. There are two specific situations under this policy that we’re going to review,” Zuckerberg writes. “The first is around instances of excessive use of police or state force. Given the sensitive history in the US, this deserves special consideration. The second case is around when a country has ongoing civil unrest or violent conflicts.”

He also ended the note by writing, “To members of our Black community: I stand with you. Your lives matter. Black lives matter,” making Zuckerberg one of the few tech leaders to personally avow support for the movement outside company statements and donations. Shortly after Zuckerberg’s post, Amazon CEO Jeff Bezos shared a post on his Instagram account also pledging support for the movement and detailing an email exchange in which he explains the meaning of the phrase to a customer who complained about Amazon’s Black Lives Matter website banner.

Zuckerberg has spent the last few days defending his decision not to take action against a Trump post in which the president wrote, “When the looting starts, the shooting starts.” Twitter, which days earlier had fact-checked the president’s false statements about mail-in ballots, restricted the tweet in an unprecedented move, labeling it as “glorifying violence” and disabling the ability to retweet or comment on it. Facebook, on the other hand, left up the post containing identical language.

“I know many people are upset that we’ve left the President’s posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies,” Zuckerberg said late last week in a Facebook post clarifying his position. The public and employee response has been widespread outrage, with employees staging the company’s first-ever walkout on Monday and dozens of former employees writing an open letter condemning Zuckerberg’s decision. The situation has even led to some high-profile resignations.

In his new Friday evening post, Zuckerberg says the company will “review our policies around voter suppression to make sure we’re taking into account the realities of voting in the midst of a pandemic.” He specifically cites potential misinformation, like the kind Trump tweeted out that led to Twitter’s fact-checking note, around mail-in voting and trying to better clarify what the line is “between a legitimate debate about the voting policies and attempts to confuse or suppress individuals about how, when or where to vote.”

Zuckerberg also says Facebook will be reviewing ways of handling violating content that depart from its binary, leave-it-up or take-it-down approach. “I know many of you think we should have labeled the President’s posts in some way last week. Our current policy is that if content is actually inciting violence, then the right mitigation is to take that content down — not let people continue seeing it behind a flag,” Zuckerberg writes. “There is no exception to this policy for politicians or newsworthiness. I think this policy is principled and reasonable, but I also respect a lot of the people who think there may be better alternatives, so I want to make sure we hear all those ideas.”

Additionally, Facebook will work to improve the transparency around how it makes these decisions and whether it can “change anything structurally to make sure the right groups and voices are at the table” when it does make a definitive choice around a controversial speech and moderation issue.

Important context here is that Black and Hispanic employees make up less than 10 percent of Facebook’s workforce. In 2018, a Black employee, Mark Luckie, quit over what he publicly said was Facebook’s “black people problem,” referencing the company’s lip service regarding racial diversity and inclusion efforts that Luckie said rarely translated to meaningful change.
