Experts: Hate, extremism on social media spreads amid Israel-Hamas war

Oct. 24, 2023

The impacts of the war between Israel and Hamas are compounding in the United States, as hateful online rhetoric has homeland security experts concerned about the spread of extremist ideologies in the U.S.

Jewish, Muslim and Arab communities are fearful, and law enforcement agencies are on high alert amid the heightened tensions worldwide.

Online misinformation about the war and these communities is top of mind for many homeland security experts, who say it's playing a big part in inciting extremism, violence and hate.

John Cohen, a former Department of Homeland Security official and ABC News contributor, believes online hate activity has fueled terrorism and extremism amid recent political polarization, as hateful content saturates social media platforms.

"We're an angry nation," he told ABC News.

Individuals who are "angry" and "disaffected" are often targets for online misinformation campaigns that blend ideological beliefs with personal grievances, Cohen said.

Hateful messages can spread quickly online, landing in front of unsuspecting viewers who may not initially even realize they're engaging with anti-Muslim, anti-Palestinian or antisemitic content, according to former DHS official and ABC News contributor Elizabeth Neumann.

Wadea Al-Fayoume, 6, a Muslim boy who, according to police, was stabbed to death in an attack that targeted him and his mother for their religion and as a response to the ongoing conflict between Israel and Hamas, poses in an undated family photograph obtained by Reuters on Oct. 15, 2023. CAIR/via Reuters

"It works its way into both dark places on the internet, as well as mainstream platforms," Neumann said.

"It sows grievance, and it creates a moral justification for some form of hostile action against the outgroup -- whoever that is, it could be Muslims, it could be Jews. Anybody that the disinformation is about, it kind of creates that moral justification that violence is necessary," Neumann said.

The Department of Homeland Security has warned of an increase in antisemitic and Islamophobic hate attacks in the U.S. since the start of the Israel-Hamas war, including the stabbing of a 6-year-old Muslim boy and his mother in Illinois.

DHS Secretary Alejandro Mayorkas recently urged cultural leaders to be cautious of the threat of a "lone wolf, the individual incited to violence by an ideology of hate."

However, Neumann and other experts believe that pinning the threat on lone actors misrepresents the online communities behind the spread of extremist viewpoints.

"Every individual that we can point to that has committed an act of terrorism in the last 10 years has been inspired by a milieu of people with these extremist viewpoints, predominantly the white power movement and then the anti-government extremist movement" in the U.S., she said.

Homeland security experts say these movements are often leaderless by design, with individuals taking action by themselves as opposed to a group carrying out attacks. This makes it more difficult for law enforcement agencies to tackle the growing threat of extremism, Neumann said.

With "millions of people participating in this disgusting rhetoric, which one of them is going to be the one that decides 'today's the day'" to commit violence, Neumann asks.

This is not a new phenomenon, experts point out. But social media has made it easier and faster for hateful rhetoric or conspiracy theories to spread.

Cohen argues that while extremist groups have "embraced the power of the internet and have incorporated internet-based communication capabilities into their tactical operations," U.S. agencies have not adjusted their investigative processes to tackle the growing threat.

A Miami Beach police patrol drives past Temple Emanu-El synagogue in Miami Beach, Fla., on Oct. 9, 2023, after the Palestinian militant group Hamas launched an attack on Israel. Marco Bello/AFP via Getty Images, FILE

The European Commission has formally requested information from several social media giants on their handling of content related to the Israel-Hamas war, according to statements from the commission.

"We are now at a turning point. The rapid evolution of the digital space, combined with the growing threat of terrorism and disinformation, calls for a rapid, decisive and coordinated response," Thierry Breton, EU commissioner, said in a speech delivered to the European Parliament on Wednesday.

According to the commission's statement, Meta is being asked to provide more information on the measures it takes regarding "the dissemination and amplification of illegal content and disinformation."

"We have a well-established process for identifying and mitigating risks during a crisis while also protecting expression," a Meta company spokesperson told ABC News in response to the commission's request.

The company says it established a special operation center after the terrorist attacks by Hamas, and it's staffed with experts, including fluent Hebrew and Arabic speakers.

"Our teams are working around the clock to keep our platforms safe, take action on content that violates our policies or local law, and coordinate with third-party fact-checkers in the region to limit the spread of misinformation," the Meta spokesperson said.

The company added that it will respond to the European Commission and is happy to provide further details of its work, beyond what it has already shared.

TikTok is also being asked to provide information as it relates to "the protection of minors online," in addition to turning over information on how it mitigates the spread of "terrorist and violent content and hate speech."

"We'll publish our first transparency report under the DSA [Digital Services Act] next week, where we'll include more information about our ongoing work to keep our European community safe," a TikTok spokesperson told ABC News on Thursday.

TikTok had previously published a statement detailing its efforts to limit the spread of shocking or graphic content by adding "opt-in screens" that prevent people from viewing that type of content unless they choose to.

Last week, the commission sent a similar request to X, formerly known as Twitter.

In a post on Oct. 9, the company said it had paid close attention to the outpouring of content on the platform related to the Israel-Hamas war. There were more than 50 million posts globally on the Hamas attack on Israel in the two days after the attack began on Oct. 7, X said.

Melanie Pell, chief field operations officer of the American Jewish Committee, and Abed A. Ayoub, national executive director of the American-Arab Anti-Discrimination Committee, spoke to ABC News about the threats facing their communities. ABC News

"As the events continue to unfold rapidly, a cross-company leadership group has assessed this moment as a crisis requiring the highest level of response," the company said. "This means we're laser-focused and dedicated to protecting the conversation on X and enforcing our rules as we continue to assess the situation on the platform."

Instagram, Facebook and X all offer a similar feature that lets users decide what content they want to view, which can limit potentially sensitive content from appearing in search results.

TikTok also says it has added moderators who speak Arabic and Hebrew to review content related to these events. It said it has removed more than 500,000 videos and closed 8,000 livestreams in the impacted region to date for violating its guidelines.

While these platforms are well-acquainted with how extremist and violent content can spread during times of crisis, experts and users say the sheer volume of content during this war seems to exceed what they've experienced during past conflicts.

Meta, which owns and operates Facebook and Instagram, told ABC News that in the three days following the terrorist attacks of Oct. 7, it removed or marked as disturbing more than 795,000 pieces of content for violating policies in Hebrew and Arabic.

Compared with two months prior, Meta said it removed seven times as many pieces of content daily for violating its "Dangerous Organizations and Individuals" policy in Hebrew and Arabic alone.

The increase in hate has put leaders in the Arab and Jewish communities in the U.S. on high alert.

Abed A. Ayoub, national executive director of the American-Arab Anti-Discrimination Committee, said his organization is focused on combating online misconceptions about the Arab and Muslim communities during this time. Uplifting the voices of Palestinians, particularly those under siege by Israeli forces, and fighting against their erasure is top of mind, he said.

"We really need [people] to educate themselves and listen to Palestinian voices, listen to Arab voices, before casting judgment on an entire community," Ayoub told ABC News.

Melanie Pell, chief field operations officer of the American Jewish Committee, told ABC News that community leaders have long taken any threats to the safety and security of Jewish institutions seriously.

As antisemitic hate crimes continue to rise, Jewish leaders say they're working in collaboration with law enforcement to continuously monitor potentially violent activity.

"We know invariably, when tensions flare in other parts of the world, the reverberations are felt everywhere, including in our homes and our communities," Pell said. "So we're really bracing for a very vulnerable time and thankfully, law enforcement is paying very close attention."

ABC News' Emmanuelle Saliba contributed to this report.