
Following the first coronavirus-related fatality in Israel — an 88-year-old Holocaust survivor — Twitter was flooded with celebratory tweets, including one gloating, “I’m going to have a big dinner when the death toll reaches 1,000.”

In response to the disease reaching Gaza, Ariel Gold, head of the controversial activist group CODEPINK, tweeted: “Israel is culpable for every coronavirus death in Gaza.” Former KKK leader David Duke posted a question about the coronavirus earlier this month: “Are Israel and the global Zionist elite up to their old tricks?”

These examples are representative of a broader, disturbing trend taking place online. As COVID-19 has spread, the Anti-Defamation League has reported a significant increase in posts insinuating Jewish involvement in creating and spreading the virus.

Scapegoating Jews for the ills of the world is nothing new. From blaming Jews for the spread of the Black Death to comparing the only Jewish state to Nazism, anti-Semitism has had many faces. But while the COVID-19 pandemic may be awakening age-old anti-Semitic tropes, that doesn’t mean we can’t take action to stop it.

The most significant step social media companies can take right now to address increasing anti-Semitism on their platforms is to adopt the International Holocaust Remembrance Alliance’s (IHRA) definition of anti-Semitism. Established to build global consensus on what constitutes anti-Semitism, this definition is officially recognized by dozens of countries worldwide. Nations that adopt the IHRA definition also appoint an official responsible for tackling anti-Semitism at the national level.

While it has been encouraging to see international bodies, political parties and governments adopt the IHRA framework, the rampant anti-Semitism on social media reflects a moral and tactical failure to address the issue properly. The IHRA framework provides a path forward.

Facebook’s and Twitter’s content policies show where a lack of clarity has been detrimental, and where sharper rules could help roll back hatred and even prevent violence.

Facebook’s Community Standards detail its “hate speech” policy, which bars attacks on groups of people based on race, religion, sexual orientation and other protected categories. But while some anti-Semitic content falls under those definitions, conspiracy theories and anti-Semitic smears that replace the word “Jews” with “Zionists” do not.

Twitter’s rules specifically prohibit the use of Holocaust imagery in targeted harassment, as well as hate symbols such as swastikas. But while Twitter is careful about content related to the Holocaust, its rules do not prohibit anti-Semitic conspiracies, which rise in popularity whenever social or economic tensions are high.

The account of notorious anti-Semite Louis Farrakhan, for example, is still active, despite Twitter’s decision to remove his “verified” badge and broaden its policies to include “dehumanizing” language against religious groups. 

Incorporating the IHRA definition directly into their policies, and assigning a staff member to oversee its enforcement, could help these platforms tackle anti-Semitic conspiracies in two main ways. First, it would properly define anti-Semitism. Under the IHRA framework, for instance, blaming “Zionists” for spreading contagious diseases is clearly anti-Semitic. But under current social media policies, it’s unclear at best whether such posts violate the terms of service.

Second, social media companies can implement proactive tools to remove anti-Semitic content — such as by flagging certain topics or keywords for review — before it spreads. Today, the system requires users to report content, which is only then evaluated by an algorithm or another human being — a reactive approach to a problem that requires both a reactive and proactive response. 
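For readers wondering what such a proactive screen might look like in practice, here is a minimal, purely illustrative sketch in Python. The watchlist phrases, function name and review queue are assumptions for the sake of demonstration, not any platform’s actual moderation system; real pipelines pair machine-learning classifiers with human review at far greater scale.

```python
# Illustrative sketch only: a minimal pre-publication keyword screen.
# The watchlist, function name and review queue are hypothetical and
# do not reflect Facebook's or Twitter's actual moderation pipelines.

WATCHLIST = {
    "zionists created the virus",
    "jews spread the virus",
}

def flag_for_review(post_text: str) -> bool:
    """Return True if the post matches a watchlist phrase and should be
    held for human review before it is widely distributed."""
    text = post_text.lower()
    return any(phrase in text for phrase in WATCHLIST)

# Screen posts as they are submitted, rather than waiting for user
# reports after the content has already spread.
incoming = [
    "Stay home and stay safe, everyone.",
    "The Zionists created the virus to profit from a vaccine.",
]
review_queue = [post for post in incoming if flag_for_review(post)]
print(review_queue)  # only the conspiracy post is queued for review
```

The point of the sketch is the ordering: flagging happens before distribution, so a human reviewer sees the post before it can go viral, rather than after users have reported it.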

Social media companies that give bigoted individuals easy access to a wide audience have a moral obligation to ensure that dangerous rhetoric isn’t allowed to thrive on their platforms. Failing to do so makes them complicit, at least in part, in the anti-Semitic violence we see from those radicalized by conspiracy theories spread online.

Emily Schrader is CEO of digital marketing firm Social Lite Creative and an associate fellow at the J’accuse Coalition for Justice.
