Tuesday, 17 March 2026

Teenagers Sue Elon Musk's xAI After Grok Used to Create Child Sexual Abuse Images

Three Tennessee teenagers have filed a federal lawsuit against Elon Musk's AI company xAI, alleging that its Grok chatbot was used to create child sexual abuse material from ordinary photos of them, including school yearbook pictures and social media images, without their knowledge.
Filed on Monday in the Northern District of California and seeking class-action status, it is the first suit brought by alleged minor victims of the deepfake scandal that has surrounded xAI for months. Two of the three plaintiffs are still under 18; all three are withholding their names to protect their privacy.
The case centres on Grok Imagine, a feature xAI released with what it called a "spicy" mode, which allowed users to generate and edit sexualised images of real people. According to the complaint, a single perpetrator used the tool to compile photos and videos of more than 18 girls, many of whom were from the same school, and digitally altered them into explicit content. The material then circulated on Discord and Telegram, where some images were traded for other child sexual abuse material in online chatrooms.
The first plaintiff, identified as Jane Doe 1, found out what had happened after receiving an anonymous Instagram message directing her to a Discord server containing explicit images of herself and at least 18 other girls she recognised from school. Among the images that were altered were her homecoming photo from September and what appeared to be her yearbook picture. A second image depicted her topless. The perpetrator was arrested in December following a police investigation, but the images had already spread across the internet. The two remaining plaintiffs learned through the criminal investigation in February that their images had been used in the same way.
The lawsuit argues that xAI didn't just fail to prevent the abuse, but that it actively created the conditions for it. "xAI — and its founder Elon Musk — saw a business opportunity," the complaint states. "They knew Grok could produce such results, including by using the images and videos of children, and publicly released it anyway." Attorneys allege that the undressing feature was deployed specifically to drive user growth and monetise the platform, and that "a model that can create sexualized images of adults cannot be prevented from creating CSAM of minors."
Plaintiffs' counsel Annika Martin said: "These are children whose school photographs and family pictures were turned into child sexual abuse material. Elon Musk and xAI deliberately designed Grok to produce sexually explicit content for financial gain, with no regard for the children and adults who would be harmed."
Fellow attorney Vanessa Baehr-Jones added: "These young people — these children — are facing a lifetime of having these … sexualized images of what appears to be a child's body out there on the internet. It wouldn't have been possible but for this tool that xAI released knowing full well that this material could be generated."
The human cost has been significant. The mother of one plaintiff, speaking anonymously to protect her daughter, said the incident "crushed" her child, who was previously a social and outgoing student-athlete. "It definitely put her into a little bit of a shell, which we had never seen before," she said. Lawyers warn that the consequences will likely follow the girls for decades: the plaintiffs are expected to receive National Center for Missing and Exploited Children notifications for the rest of their lives, alerting them whenever their images surface in criminal cases.
Researchers estimated that Grok generated around 23,000 images appearing to depict children over just an 11-day period after the tool launched. The scale of the problem prompted investigations from the California attorney general, the European Commission and UK regulator Ofcom. By mid-January, xAI said it had rolled back its editing tools in some jurisdictions.
Musk has largely deflected responsibility. In January he wrote that he was "not aware of any naked underage images generated by Grok. Literally zero," placing blame on users rather than the platform. He added that Grok "does not spontaneously generate images, it does so only according to user requests" and would refuse anything illegal, saying that bugs were fixed "immediately." Last week he posted that "if it's allowed in an R-rated movie, it's allowed" by Grok's image tools.
The plaintiffs are seeking unspecified damages and an immediate injunction requiring xAI to stop enabling the creation of such images. The suit alleges violations of child pornography laws, defective product design and the creation of a public nuisance.