Despite regularly promoting and engaging with antisemitic conspiracy theories on his platform X (formerly known as Twitter), Elon Musk would like you to know that he doesn’t hate Jews, thank you very much. On Monday, he kicked off his I <3 Jewish People tour with a visit to the Auschwitz-Birkenau concentration camp in Poland, where he managed to evince a look of sober introspection for the photographers present (though the fact that he had his child perched on his shoulders as if they were watching the majorettes at a Memorial Day parade somewhat undercut this expression).
Musk followed that up with an appearance at a conference in Warsaw, telling conservative commentator Ben Shapiro that according to “outside audits,” X has “the least amount of antisemitism” compared to other social media platforms. Also, he would like you to know he has Jewish friends. A lot of them! And that he is “aspirationally Jewish.” Happy almost-Tu B’shevat, everyone!
Unfortunately for Musk, there are a fair number of Jewish people on X, many, if not most, of whom can read, and who thus have a sense of how rampant antisemitism is on the platform. Indeed, one recent report from the online antisemitism tracker Cyberwell, which focused specifically on content engaging in the “denial and distortion of the events of Oct. 7,” determined that of the big-five social media platforms (X, TikTok, Instagram, Facebook, and YouTube), X was by far the most prominent purveyor of such content. According to the report, X was responsible for 47.3 percent of the 313 antisemitic posts in its data set, more than twice the share of Facebook, which accounted for 22 percent of the posts analyzed. Further, the report found that X had the lowest takedown rate of all the platforms, removing a measly two percent of the posts flagged for violating its guidelines on violent event denialism.
The Cyberwell report didn’t track antisemitic rhetoric in general; it focused specifically on content that downplayed or spread misinformation related to the events of Oct. 7, meaning the Hamas-led massacre that resulted in the deaths of approximately 1,200 people in Israel. The massacre, which has been widely reported as the most lethal mass murder of Jews since the Holocaust, prompted Israel to initiate a brutal offensive in the Gaza Strip, which has so far left more than 25,000 Palestinians dead.
The Cyberwell report did not take into account general antisemitic messaging, or specific phrases that some have argued are antisemitic; rather, it focused on three of the most common narratives associated with Oct. 7 denialism, which it characterizes as “the recycling of the same denial mechanism that was used against Jews following the Holocaust.” These narratives include denials of reports of sexual violence committed by Hamas (which made up 38.66 percent of the analyzed posts); the conspiracy theory that Israel was itself responsible for the massacre (36.7 percent); and the false idea that Israel profited off the massacre in some way (5.11 percent).
In theory, X’s community guidelines prohibit “content that denies that mass murder or other mass casualty events took place, where we can verify that the event occurred.” (The platform did not respond to Rolling Stone‘s request for comment.) Yet Musk has expressed numerous times that his preferred policy is “freedom of speech, not freedom of reach,” meaning the platform’s approach to such content is to de-amplify it, relying on the Community Notes feature so users can fact-check information in real time. “I think at [the] end of the day free speech wins, in that if somebody says something that is false, especially on our platform, you can then reply to it with a correction,” Musk said in the interview with Shapiro on Monday.
But in an interview with Rolling Stone, Tal-Or Cohen Montemayor, the founder and CEO of Cyberwell, says that X is failing to enforce its own policies regarding Oct. 7 denialism. Though she says the report found that X will de-amplify such content if it is posted by lower-level influencers, that does not apply to verified subscribers with large followings. “[X has] the most amount of Oct. 7 denialism and distortion, and that’s almost by design, because they won’t remove that type of content,” she says. “But what we were very alarmed by is that they’re also not applying their own policies of labeling, and de-amplification.”
Notably, the report found that, in addition to hosting the most Oct. 7 denialist content of all the platforms and removing the least of it, X saw such posts get the most engagement, racking up 17.8 million of the 25 million views amassed by all 313 posts in the data set. Montemayor attributes this both to the lack of consistent policy enforcement on X and to the chaotic news environment in the days after the Hamas massacre. “Oct. 7 presents a challenge for social media platforms, because they were treating it as an unfolding, emergent situation that they would subject to third-party fact-checking,” she says. “I could almost say that was legitimate maybe in the first month or so of the war, when things were unfolding and information was coming out. But now there’s just an open campaign trying to tell millions of people that this never happened.”
Over the past few months, Musk has garnered intense criticism for using his platform to engage with antisemitic conspiracy theorists and promote their content. Most notably, in November he endorsed the Great Replacement theory, an antisemitic trope blaming Jewish people for promoting “hatred against whites,” by replying, “You have said the actual truth” to a user promoting the theory. The tweet prompted a flurry of criticism from organizations like the Anti-Defamation League (ADL), as well as an exodus of advertisers such as Apple and Disney from the platform.
In the months since, Musk has embarked on a one-man campaign to prove that he is not, in fact, antisemitic, the culmination of which was his visit to Auschwitz earlier this week. He has also condemned the use of phrases that are perceived by some as antisemitic, such as the pro-Palestinian slogan “from the river to the sea,” categorizing it as a call to genocide, a move that has been applauded by some supporters of Israel but criticized heavily by those on the left, as well as by free speech advocates.
To be fair, the most recent report by Cyberwell features a relatively small cross-platform data set and fairly specific criteria for what constitutes antisemitic hate speech. But that also illustrates the breadth of the problem: the report represents but a small swath of the virulently antisemitic content on X, with previous analyses finding a nearly 1,000 percent increase in anti-Jewish content on the platform in the months following Oct. 7. The preponderance of vile antisemitic conspiracy theories on his website, such as those that went viral earlier this month following a local news story about Chabad Jews digging a tunnel underneath a synagogue in Brooklyn, appears to undermine Musk’s efforts to prove his pro-Jewish bona fides.
“I think that any personal process that Elon Musk is going through with antisemitism, and public statements that he makes, should be backed up with better enforcement of his policies,” says Montemayor. “And I think that there’s also such flagrant antisemitism on all social media platforms, that ‘from the river to the sea’ is probably the least of our problems.”