Mother of Elon Musk's child sues xAI over pornographic deepfakes created by Grok chatbot
Kyiv • UNN
Ashley St. Clair, the mother of one of Elon Musk's children, has filed a lawsuit against xAI over deepfakes created by the Grok chatbot. She claims that the neural network generated sexually exploitative images featuring her face.

Ashley St. Clair, 27, a writer and the mother of one of billionaire Elon Musk's children, has filed a lawsuit against his artificial intelligence company xAI. She claims the Grok chatbot allowed users to create sexually exploitative deepfakes of her face, causing her severe emotional distress. This was reported by AP, UNN writes.
Details
In the lawsuit, filed in New York, St. Clair describes shocking examples of the generated images. Among them is a childhood photo of her at age 14, which the neural network altered to "dress" the girl in a bikini. The suit also cites adult deepfakes of her in sexualized poses and images in bikinis adorned with swastikas, which is particularly offensive given that the plaintiff is of Jewish descent.
"I have suffered and continue to suffer severe pain and mental distress as a result of xAI's role in the creation and dissemination of these digitally altered images of me. I am humiliated and feel that this nightmare will never end as long as Grok continues to create these images of me," St. Clair said.
Accusations of retaliation from the X platform
St. Clair said she repeatedly asked the X social network to remove the content but was initially told that the images did not violate the platform's policy.
Later, she says, the company resorted to "retaliation": her account, with a million followers, was stripped of its premium subscription and verification badge, cutting off her ability to monetize it. Meanwhile, the humiliating fakes continued to spread.
xAI's response and new Grok restrictions
Amid a global scandal over the generation of sexualized images of women and children, the X platform announced new restrictions: Grok reportedly can no longer edit photos of real people to depict them in revealing clothing. For St. Clair, who says she lives in constant fear because of what the chatbot's users have done, these measures come too late. xAI representatives have so far declined to comment on the lawsuit.