EU Slams Elon Musk’s X Over Content Moderation and Deepfake Risks

The European Union has confirmed that its investigation into the social network, opened in December, is ongoing. Violations of the regime could prove costly for Musk, as enforcement authorities have the power to impose fines of up to 6% of global annual turnover.

On Wednesday, the Commission announced that it had sent X a formal request for information (RFI). Among the areas to be examined is data access for researchers.

The RFI also addresses some new concerns: The EU says it is asking X about its content moderation activities and resources in light of its latest transparency report (another DSA requirement), which shows that the company has reduced its content moderation resources by one-fifth (20%) since the previous report in October 2023.

The report also revealed cuts that affect X’s capacity to take action against harmful content.

Another new EU concern relates to X’s approach to generative AI. The Commission called for more details on “risk assessments and mitigation measures related to the impact of generative AI tools on electoral processes, the spread of illegal content and the protection of fundamental rights.”

X is regulated as a so-called very large online platform (VLOP) under the DSA, meaning it is subject to an additional layer of rules – overseen by the Commission itself – that require it to evaluate and mitigate systemic risks, for example in areas such as disinformation.

“The request for information sent today is a further step in an ongoing investigation,” the EU said in a press release, adding that it builds on the evidence collection and analysis carried out to date, including in relation to X’s transparency report published in March 2024 and responses from X to earlier requests.

Back in March, the Commission sent a flurry of RFIs to several VLOPs, including X. The EU is concerned about the role that political deepfakes could play in the European Parliament elections next month.

The latest RFI to X gives the platform until May 17 to provide answers to its questions about content moderation resources and generative AI. The other requested information must be received by the Commission by May 27.

X was contacted for a response to the development but had not commented as of press time.

During a briefing with journalists last month, a senior Commission official declined to give a full overview of its investigation into X but described contacts with the company as “quite intense.”

The official also confirmed that one active topic of discussion is X’s Community Notes feature and its role in responding to election risks.
