Linda Yaccarino responds to EU: 700 Community Notes, 5K+ images shared on Israel-Hamas war, “thousands” of pieces of content removed
X, the social platform formerly known as Twitter, came under fire earlier this week in Europe, when European Commissioner Thierry Breton sent a stark open letter to the company warning it of its failure to clamp down on disinformation and illegal content on the platform circulating in the aftermath of the deadly Hamas terrorist attack on Israel. Today, X responded with a letter long on pages, but relatively short on numbers and direct acknowledgement of its stumbles.
A letter signed by X CEO Linda Yaccarino notes that the company has “redistributed resources” and “refocused teams”. The letter stays, in Yaccarino’s words, “high level”, which means it is light on specific numbers. “Shortly after” the attack (no exact timing is given), a leadership group was assembled to consider X’s response; “tens of thousands” of pieces of content have been removed; user-generated Community Notes now appear on “thousands” of posts; and “hundreds” of accounts linked to terrorist groups, violence or extremism have been removed. She does not give any estimate of how much content the company is facing overall that still needs moderating and checking.
She added that X is responding to law enforcement requests, but also said the company had not received any requests from Europol at the time of writing.
Significantly, however, the letter does not acknowledge or address any of what many users had been seeing in plain sight on the platform since Saturday, which included graphic videos of the terrorist attacks on civilians, as well as posts allegedly showing footage from the attacks in Israel and Gaza that had already been identified as false.
Nor does she acknowledge that Elon Musk himself, the owner of X and arguably the most popular user of the platform these days, shared a recommendation to follow an account known for spreading antisemitic content.
It appears that post is down, but search for the terms he used and you can find many, many shares of a screenshot, which highlights the slippery problem X and other social media companies have here. Many others have reported on the mess on the platform (Wired described X as “drowning in disinformation”), and those reports were likely a major spur for the EU sending its letter in the first place.
The response comes in the wake of Breton sending a similar letter to Meta yesterday. Meta told TechCrunch that it too had assembled a team to respond and was actively working to keep harmful content off its platforms. It is likely drafting a letter of its own to the Commissioner, much like X’s.
The bulk of X’s four-page letter takes the EU through X’s existing policies in areas like its basic rules, public interest exceptions, and its policy on removing illegal content.
But with the company’s staff depleted in areas like content moderation and trust and safety, Community Notes have taken on a very prominent role in policing what gets said on the platform, and that is where Yaccarino gets a little more specific (but only a little).
She noted that more than 700 Community Notes related to the attacks are being seen, out of tens of millions of views on posts carrying Community Notes overall in the last four days (though that larger number covers all subjects, not just Israel). It’s unclear whether the comparison is meant to suggest that Israel-Hamas content makes up a relatively small share, or to underline how much activity there has been.
She also noted that more than 5,000 posts carrying matching video and other media have been covered as a result of its “notes on media” feature, and that number grows as those posts get shared. Notes are appearing on posts about five hours after they are created, the letter said, and the company is working on speeding that up. (Notes attached to media are getting approved faster, she added.)
Breton’s letter is an early example of how the EU is likely to enforce its newly minted content moderation rules, which are part of its Digital Services Act and impose special requirements on very large online platforms, a category that, despite the exodus of users since the rebrand from Twitter, still includes X. As Natasha has noted previously, disinformation is not illegal in the EU, but X now has a legal obligation to mitigate risks attached to fake news, which includes responding swiftly when illegal content is reported.