Earlier this week, European Commissioner Thierry Breton sent a stark open letter to X, formerly Twitter, criticizing its failure to curb disinformation and illegal content on the platform after the Hamas terrorist attack on Israel. Today, X responded with a lengthy letter of its own that is short on numbers and never directly acknowledges any mistakes.
X CEO Linda Yaccarino wrote that the company "redistributed resources" and "refocused teams" in response. But the letter stays at a high level, light on specific numbers: a leadership group was formed to assess X's response "shortly after" the attack (no exact timing is given); "tens of thousands" of pieces of content have been removed; user-generated Community Notes appear on "thousands" of posts; and "hundreds" of accounts linked to terrorist groups, violence, or extremism have been taken down. She doesn't estimate how much content needs moderation or checking in the first place.
X is responding to law enforcement requests, she said, though Europol has not yet requested anything.
However, the letter does not address any of what many users have been seeing on the platform since Saturday, including graphic videos of terrorist attacks on civilians and posts purporting to show footage from Israel and Gaza that had already been debunked.
She doesn’t mention that Elon Musk, X’s owner and arguably the platform’s most popular user, recommended following an antisemitic account.
That post appears to have been taken down, but a search for the terms Musk used turns up countless shares of screenshots of it, an example of the slippery problem X and other social media companies face here. Wired called X "drowning in disinformation," and many other outlets have reported on the platform's mess, which likely prompted the EU's letter.
The response follows Breton’s similar letter to Meta yesterday. Meta told that it had a team to respond and was actively removing harmful content. A similar letter to X’s to the Commissioner is likely.
The majority of X’s four-page letter walks the EU through its basic rules, public interest exceptions, and illegal content removal policies.
However, with the company's content moderation and trust & safety staff heavily depleted, Community Notes have taken on a prominent role in policing content on the platform. Yaccarino gets a little more specific, but only a little, about that.
She noted that over 700 Community Notes related to the attacks have been viewed, out of tens of millions of Community Notes posts across all subjects in the last four days. It's unclear whether that comparison is meant to suggest Israel-Hamas content is a small share of the whole or to highlight the feature's activity.
She also noted that X's "notes on media" feature, which matches videos and other media across posts, is appearing on more than 5,000 posts, a number that grows as those posts are shared. The letter says the company is working to shorten the time it takes for a note to be published, currently about five hours after creation; media notes, she added, are approved faster.
Breton's letter is an early example of how the EU will enforce its new content moderation rules under the Digital Services Act, which imposes special requirements on very large online platforms like X, which has lost users since rebranding from Twitter. Natasha has noted that disinformation is not itself illegal in the EU, but X must now mitigate the risks posed by fake news and respond quickly to reports of illegal content.