Digital Originals

Published: 2023-12-10

Surge of 'pro-terror,' antisemitic content online renews debate over free speech on the internet

Tech nonprofit that tracks antisemitism blames Section 230 for 'toxic' relationship between US and social media

A photo of a protest against hate speech, with people holding signs advocating for free speech and condemning antisemitic content.

A recent surge in 'pro-terror content' amid the Israel-Hamas war has sparked a contentious debate over the limits of free speech on the internet.

As social media platforms become inundated with antisemitic and violent content following Hamas' attack on Israel, concerns about the government's role in policing online speech have resurfaced.

Tal-Or Cohen Montemayor, the founder of CyberWell, an Israeli tech nonprofit that monitors antisemitic speech online, warns that the intersection of antisemitism, radical Islamic ideology, and pro-terror content poses a risk to Western democracies, including the United States.

Montemayor describes the recent surge in online antisemitism and pro-terror content as the 'largest hijacking of social media platforms by a terrorist organization,' emphasizing the alarming spread of misinformation and propaganda.

To combat this, CyberWell uses AI to analyze and flag content that is highly likely to be antisemitic. Its recent report documents an 86% increase in such content since October 7, underscoring what the organization sees as an urgent need for regulation.

A close-up shot of a person using a social media platform, highlighting the issue of online radicalization and the spread of pro-terror content.

Montemayor points to Section 230 of the Communications Decency Act of 1996, which shields social media platforms from liability for user-generated content, as a significant obstacle to addressing what she calls the 'toxic' relationship between the US and social media.

Section 230 states that 'no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.' While some argue that this provision protects free speech, others like Montemayor believe it perpetuates a toxic online environment.

Adrian Moore, vice president of policy at Reason Foundation, a libertarian think tank, supports Section 230, arguing that it keeps online speech free. He suggests that efforts to limit offensive content invite subjective censorship and that private companies are better positioned than government to moderate their own platforms.

Despite disagreeing over Section 230, both Montemayor and Moore see a need for change. Montemayor proposes a system similar to the European Union's Digital Services Act, which requires online platforms to monitor illegal content and be transparent about how it is handled.

Notably, Montemayor calls for transparent access to data showing how hate and pro-terror content is reported and handled by social media platforms, even if Section 230 remains unchanged.

An image of a diverse group of individuals engaged in a constructive online dialogue, representing the importance of open discussions and combating extremist ideologies.

Montemayor also raises concerns about social media algorithms creating 'silo' effects that reinforce biased or extremist content, warning that younger generations are being continually exposed to anti-American trends.

Moore cautions against increased regulation, arguing that defining what counts as 'bad' content is subjective and susceptible to abuse. He also contends that social media companies already have an economic incentive to moderate offensive posts, since such content can drive users away.

The surge in antisemitic and pro-terror content online has reignited the debate over free speech on the internet, with sharp disagreement over how to strike a balance that protects freedom of expression while tackling harmful content and its impact on society.