Elon Musk's Grok AI mass-produces sexualized deepfakes of X users - Reuters
Kyiv • UNN
Grok, integrated into X, generates sexualized images of real people, including minors. xAI denies the accusations, and Elon Musk responds with a laughing-crying emoji.

The Grok chatbot, integrated into the social network X (formerly Twitter), found itself at the center of a scandal due to its ability to generate sexualized images of real people. An analysis by Reuters confirmed numerous instances of digital "undressing" of women and the creation of indecent content involving minors. This is reported by UNN.
Details
X users are using Grok to edit other people's photos, prompting it to replace their clothing with bikinis or see-through outfits. Unlike third-party "nudification" services, Grok does this instantly, directly within the platform.
Journalists recorded over 100 photo-editing attempts in a single 10-minute period. In many cases the bot complied, producing explicit images of women, and it ignored signs that children might be depicted, such as requests mentioning "school uniforms."
Elon Musk and xAI's Reaction
xAI dismissed the media's findings as "legacy media lies." Platform owner Elon Musk, for his part, reacted with a laughing-crying emoji under posts featuring AI-edited images of celebrities and under user complaints about the prevalence of nude content in their feeds.
International pressure and legal consequences
The events sparked a wave of condemnation from governments around the world. In France, parliamentarians referred the "clearly illegal" content on X to prosecutors.
India's Ministry of IT sent an official demand to the platform to stop the spread of indecent deepfakes.
Experts from The Midas Project said they had warned xAI in August that its tool was becoming a "weapon for sexual exploitation," but the company ignored the warnings.