
Women and children are being digitally undressed on social media

The social media platform X has been flooded with AI-generated sexualized images of women and children in recent weeks — months before a new law aiming to ban the spread of nonconsensual intimate imagery goes into effect.

Elon Musk, the owner of X, announced a new feature with a post on Christmas Eve encouraging users to try editing images and videos with Grok, the app’s chatbot. Then, a few weeks later, as the new year began, many women noticed something disturbing online: a flood of AI-generated sexualized images of them on the social media platform X. Users on X could ask Grok’s latest feature to digitally remove clothing from posted photos and then recirculate the altered images.

Grok won’t show people fully naked, but it follows directions to depict women wearing little more than strings and dental floss. Some X users asked Grok to manipulate photos with prompts like “put her into a very transparent mini-bikini,” “remove her school outfit” and “spread her legs.”

Musk appeared to make light of the situation when he posted laughing emojis in response to AI edits of famous people, including himself, in bikinis. But under federal law, Musk may soon have to take the problem more seriously.

In May 2025, President Donald Trump signed the bipartisan Take It Down Act, which criminalizes the distribution of nonconsensual intimate imagery, including AI-generated deepfakes and so-called “revenge porn.” Platforms have until May 2026 to implement a request-and-removal system where victims can have their pictures taken down within 48 hours.

On Saturday, Musk warned X users in a post: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” 

The question is: Which of these Grok-generated images are considered illegal? 

The law defines an “intimate visual depiction” as any image that shows uncovered genitals, the pubic area, the anus or a woman’s nipples.

Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI, said she thinks many of these skimpy bikini photos could be considered illegal. She urged people to submit requests for removal because the law incentivizes media companies to take down questionable content to avoid penalties from the Federal Trade Commission.

Pfefferkorn said it might take more public pressure to end this latest Grok feature. If X were serious about not allowing nonconsensual imagery, she said, the company would have taken the tool offline weeks ago.

“There was no reason why they couldn’t just disable that feature,” Pfefferkorn said. “Any time any other company has a feature that goes terribly, badly wrong, they can just stop it and then try and lick their wounds and figure out what happened. But Elon Musk is the richest man in the world, and he has acted repeatedly as though he is above the law and has gotten away with it.”

It’s not clear how many of these photos were circulated in what Reuters called a “mass digital undressing spree,” but the news outlet’s review of public requests sent to Grok found at least 102 attempts in a 10-minute period. 

The flood of nearly nude images, predominantly of young women and in some cases children, sparked international fury: Ministers in France have reported X to regulators; the United Kingdom’s communications regulator reached out to the company; and India’s Ministry of Electronics and Information Technology wrote a condemning letter.

In allowing these images, X is breaking with AI competitors like OpenAI and Google, which have stricter rules about what their AI chatbots will generate. 
