X platform ignoring antisemitic and Islamophobic hate – Report

By Farhan Shabir

X is failing to moderate hate speech on its platform, including content that promotes antisemitic conspiracies, praises Hitler and dehumanizes Muslims and Palestinians, according to a new report.

In new research, the Center for Countering Digital Hate (CCDH), a nonprofit that researches online hate and extremism, collected a sample of 200 X posts across 101 accounts that featured hate speech. Each post was reported on the platform on October 31 using X’s reporting tools and either “directly addressed the ongoing conflict, or appeared to be informed by it.”

That tool invites users to flag content and provide information on what category of violation it falls into, including an option for hate speech. The reporting options include “Slurs, Racist or sexist stereotypes, Dehumanization, Incitement of fear or discrimination, Hateful references, Hateful symbols & logos.”

According to the CCDH, 196 of the 200 posts remain online, while one account was suspended after being reported and two were “locked.” A sample of the posts reviewed by TechCrunch shows that X continued to host content that depicted antisemitic caricatures, called Palestinians “animals” and invited others to “enjoy the show of Jews and Muslims killing each other.”


View counts on the X posts varied, but some were viewed over 100,000 times, including posts denying the Holocaust, and one animated GIF depicting a man in a yarmulke being choked, which was viewed nearly one million times. The posts that weren’t removed garnered more than 24 million views in aggregate.

While a sample of 200 posts represents only a fraction of the content on X at any given time, many of the posts are notable for their blatant racism, their open embrace of violence and for the fact that they remain online, even now. Social media companies regularly fail to remove swaths of content that violates their rules, but they generally remove those posts very quickly when researchers or journalists highlight them.

Of the sample posts included in the CCDH report, some are now appended with a label that says “Visibility limited: this Post may violate X’s rules against Hateful Conduct.” Other content, including posts promoting antisemitic conspiracies, jokingly dismissing the Holocaust and using dehumanizing language to normalize violence against Muslims, remains online without a label.

“X has sought to assure advertisers and the public that they have a handle on hate speech – but our research indicates that these are nothing but empty words,” Center for Countering Digital Hate CEO Imran Ahmed said. “Our ‘mystery shopper’ test of X’s content moderation systems – to see whether they have the capacity or will to take down 200 instances of clear, unequivocal hate speech – reveals that hate actors appear to have free rein to post virulently antisemitic and hateful rhetoric on Elon Musk’s platform.”

In its safety guidelines, X states that users “may not attack other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease.” Under Elon Musk’s leadership, the company formerly known as Twitter has reduced its content moderation workforce, rolled back safety policies protecting marginalized groups, and invited waves of previously banned users back to the platform.

This year, X filed a lawsuit against the CCDH, alleging that the nonprofit used data on the platform without authorization and intentionally undermined the company’s advertising business. The CCDH maintains that X is using legal threats to silence its research, which has been cited heavily in a number of reports on X’s lax content moderation under Elon Musk.

The same day that the CCDH released its new report, X published a blog post touting its content moderation systems during the ongoing conflict in Israel and Gaza. The company says that it has taken action on over 325,000 pieces of content that violate its Terms of Service, and those actions can include “restricting the reach of a post, removing the post or account suspension.”

“In times of uncertainty such as the Israel-Hamas conflict, our responsibility to protect the public conversation is magnified,” X’s Safety team wrote.
