EU orders X to preserve Grok files

Brussels has ordered X to retain internal records linked to its Grok chatbot through the end of 2026, widening a probe into allegations that the tool surfaced antisemitic material and generated non-consensual sexual content. The preservation order, issued under the bloc's digital rules, is designed to prevent the loss or alteration of documents while investigators assess whether safeguards were adequate and whether platform obligations were breached.

The decision intensifies scrutiny of Grok, an artificial-intelligence assistant integrated into X and developed by xAI, the startup founded by Elon Musk. Officials say the scope of materials to be preserved includes system prompts, training and fine-tuning records, model updates, risk assessments, internal communications on moderation policies, incident reports, and data reflecting how the tool responded to user prompts flagged by civil society groups.

At issue are complaints that Grok produced antisemitic tropes and explicit imagery involving real individuals without consent. Investigators are examining whether content filters, red-teaming practices and human oversight were sufficient, and whether the platform moved quickly to mitigate harms once problems were identified. The order does not itself establish wrongdoing but signals that authorities consider the evidence trail significant enough to secure for an extended period.

The move sits within the EU’s expanding enforcement of its digital framework, which obliges large platforms to assess and reduce systemic risks, maintain audit trails and cooperate with regulators. Preservation directives are commonly used when there is a risk that logs or design documents could be deleted during fast-moving product iterations, particularly for generative AI systems that change frequently through updates.

X has previously said it is committed to complying with applicable laws and improving safety features. The company has also argued that Grok was designed to answer questions candidly and that guardrails were strengthened after early shortcomings. Since the complaints emerged, X and xAI have announced changes to filters and moderation workflows, though regulators are assessing whether those steps meet legal standards.

The investigation reflects broader unease in Europe over generative AI deployed at scale on social platforms. Lawmakers and regulators have pressed companies to demonstrate that models are trained and operated responsibly, with mechanisms to prevent hate speech, harassment and sexual exploitation. Preservation of records allows authorities to reconstruct decision-making, evaluate model behaviour over time and determine whether risk assessments matched observed outcomes.

Industry experts note that keeping materials through 2026 is an unusually long horizon, underscoring the complexity of AI audits and the likelihood that enforcement will extend beyond a single incident. The directive also creates obligations for corporate governance, as teams must ensure that engineers, product managers and legal staff do not purge or overwrite relevant data during routine maintenance or upgrades.

Civil rights organisations welcomed the step as a safeguard for accountability, arguing that without preserved evidence it is difficult to verify claims about fixes or to understand how harmful outputs occurred. They contend that non-consensual sexual content generated by AI poses acute risks to privacy and dignity, while antisemitic outputs can amplify hate in already polarised online spaces.

For X, the probe arrives as the company seeks to position Grok as a distinctive AI assistant and expand its capabilities across the platform. Compliance costs could rise as firms devote resources to documentation, audits and cooperation with regulators. Penalties, if imposed later, could include fines or remedial orders requiring changes to product design and moderation practices.

