
Grok, Elon Musk’s chatbot, has begun 2026 wrapped in a new controversy. This artificial intelligence is already known for its sarcastic responses, for spreading false information and for undressing women in photos.
This is not the first time the practice has come to light: in May 2025, Kolina Koltai, a researcher at Bellingcat, discovered that users were asking Grok to remove the clothes of women appearing in photos, and the chatbot responded with images of the women in lingerie and bikinis. On some occasions, it even replied with a link to a chat containing the explicit image.
Little has changed since last year: at 20bits we have seen that Grok is still doing the same thing. This time, users ask the chatbot to remove the clothes of the women who appear in a photo, and it shows them in bikinis, lingerie and even naked. While it is true that Grok sometimes asks for age verification before showing the result, on other occasions the photo appears without any check, as can be seen in the screenshots.
20bits was also able to verify that, while some users ask the AI to undress women, others condemn the practice because it violates people’s rights.
Why is Grok able to undress women?
Clare McGlynn, professor of law at Durham University (England), explains to the outlet Coindents that Grok does not undress real people: it creates entirely fictitious images using generative AI models trained to recognize patterns of the human body and recreate them artificially, as happened with Taylor Swift.
However, as mentioned above, the debate goes beyond the technical framework and centres on the ethical and legal implications of this type of use. The generation of non-consensual nudes, even fake ones, can violate privacy and become a form of digital harassment, which is why many AI platforms explicitly prohibit these practices. The Grok case therefore revives the discussion about the limits that should be imposed on artificial intelligence and the responsibility of technology companies to prevent abuse.