Saturday, March 7, 2026

Elon Musk’s Grok “undressing” problem has not been solved


Elon Musk’s X introduced new restrictions stopping people from editing and generating photos of real people in bikinis or other “revealing clothing.” The policy change came Wednesday night, following global outrage after Grok was used to generate thousands of harmful, nonconsensual images of women being “undressed,” as well as sexualized images of apparent minors, on X.

While some safety measures finally seem to have been put in place for Grok image generation on X, the standalone Grok app and website still appear able to generate “stripping”-style images and pornographic content, according to numerous tests by researchers, WIRED, and other journalists. Meanwhile, some users say they can no longer create photos and videos the way they used to.

“We can still generate photorealistic nudity on Grok.com,” says Paul Bouchaud, lead researcher at the Paris-based nonprofit AI Forensics, who has been tracking the use of Grok to create sexual images and has conducted multiple tests on Grok outside of X. “We can generate nudity in a way that Grok on X cannot.”

“I can upload a photo to Grok Imagine and ask for the person to be put in a bikini, and it works,” says a researcher who tested the system using a photo of a woman. Tests conducted by WIRED, using free Grok accounts on its website in the UK and US, successfully removed clothing from two photos of men without any apparent restrictions. In the Grok app in the UK, when a WIRED reporter asked it to undress a man, the app requested the user’s year of birth before generating a photo.

Meanwhile, journalists at The Verge and the investigative outlet Bellingcat also found that it was possible to create sexual images while in the UK; both have investigated Grok and X and strongly condemned the platforms for allowing users to create “stripping” images.

Since the beginning of the year, Musk’s companies – including artificial intelligence company xAI, X, and Grok – have come under fire for producing nonconsensual intimate images, explicit and graphic sexual videos, and sexualized photos of apparent minors. Officials in the United States, Australia, Brazil, Canada, the European Commission, France, India, Indonesia, Ireland, Malaysia, and the United Kingdom have all condemned X or Grok or opened investigations into them.

On Wednesday, X’s Safety account posted updates about what Grok can be used for on the social network. “We have implemented technological measures to prevent the Grok account from editing images of real people wearing revealing clothing, such as bikinis,” the account wrote, adding that the policy applies to all users, both free and paid.

In a section titled “Geoblock Update,” the X account also stated: “We are now geoblocking all users’ ability to generate photos of real people in bikinis, lingerie, and similar attire through their Grok account and on Grok on X in jurisdictions where it is illegal.” In the update, the company also said that it is working to add further safeguards and that it continues to “remove high-priority violating content, including child sexual abuse material (CSAM) and nonconsensual nudity.”

Spokespeople for xAI, the company that makes Grok, did not immediately respond to WIRED’s request for comment. Meanwhile, an X spokesperson says the company understands that the geoblocking applies to both the app and the website.

The latest move follows a widely criticized change on January 9, when X restricted Grok’s image generation to paid “verified” subscribers. A leading women’s group described the move as the “monetization of harassment.” Bouchaud, who says AI Forensics has collected approximately 90,000 Grok photos since January 9, confirms that only verified accounts have been able to generate images on X – as opposed to the Grok website or app – since that date, and that photos of women in bikinis are now rarely generated there. “We observed that they seemed to have pulled the plug and disabled the functionality on X,” he says.
