In the wake of the Christchurch shootings, social media platforms and governments alike have been struggling with how to deal with terrorist content online, and this week saw their first concrete response. On Sunday, New Zealand Prime Minister Jacinda Ardern announced a new commitment, the “Christchurch Call,” which calls on tech platforms and governments to adopt and enforce rules for removing extremist content.
The call is already attracting support from countries like France, Australia, Canada, and the United Kingdom as well as tech companies like Facebook, Twitter, Google, and Microsoft. But at least one country is refusing to sign on to the agreement: the United States.
In a statement issued today, the White House said that it will “stand with the international community in condemning terrorist and extremist content” and thanked both Ardern and French president Emmanuel Macron for their effort but said the US was not “currently in a position to join the endorsement.”
The White House did not specifically explain why it would not sign on, but the statement suggests the decision may be connected to broader right-wing concerns over deplatforming.
Earlier this month, Facebook banned far-right commentators and conspiracy theorists like Alex Jones and Milo Yiannopoulos from its platforms, a move that drew intense criticism from the president’s son. In April, the House Judiciary Committee held its own hearing on “Hate Crimes and the Rise of White Nationalism,” inviting right-wing activist Candace Owens to testify. Although the hearing was meant to focus on hate crimes, Republicans quickly redirected it to concerns over anti-conservative bias.
“We continue to be proactive in our efforts to counter terrorist content online,” the White House statement reads, “while also continuing to respect freedom of expression and freedom of the press.”
The new call to action is named after the Christchurch shootings, in which a white nationalist killed 51 people at two mosques in Christchurch, New Zealand. The attack was notable for its chilling use of digital media: the shooter live-streamed the killings on Facebook. Afterward, platforms worked to remove reuploads of the video, but versions of it were still available to view months after the attack.
The following month, a 19-year-old man opened fire at a synagogue in San Diego, killing one person and injuring three others. That shooter also frequented anonymous online forums, and he thanked other users there for posting the racist and often violent memes that played a significant role in his radicalization. On Tuesday, Facebook announced new restrictions on live-streaming video, which were intended as a response to the Christchurch Call. Twitter has not committed to any policy changes, but it expressed support for the call in a public statement. “It is right that we come together,” the company wrote through its policy account, “to ensure we’re doing all we can to fight the hatred & extremism that lead to terrorist violence.”
Facebook has signed onto the call along with other major tech companies, including Microsoft, Google, Amazon, and Twitter.