WWE wrestler Jordynne Grace has called for immediate legislation after being targeted by a trend of non-consensual deepfake image generation on X.
Grace responded to a user on X (formerly Twitter) who had used the platform's built-in 'Grok' AI tool to digitally alter a photograph of her, replacing her clothing with 'colorless' attire.
In a post on December 4, Grace stated, “There needs to be laws against this put into place immediately.”
The account responsible deleted the post after Grace responded, though its "replies" tab shows the user had targeted multiple women on the platform in a similar manner.
The incident highlights a wider surge in users employing the Elon Musk-owned AI tool to generate sexualized images of women and minors. A recent report by Reuters details how users are bypassing safeguards to “strip” subjects in photos down to bikinis, underwear, or less.
According to the Reuters analysis, the trend has affected numerous users, including many with no public profile. One case involved a musician whose photo of herself snuggling in bed with a cat was manipulated into a bikini image without her consent. The report also identified instances of the tool being used to generate sexualized images of children.
While X owner Elon Musk has appeared to minimize the controversy with social media posts featuring "laugh-cry" emojis, international regulators have expressed alarm. Ministers in France have reported the content to prosecutors as "manifestly illegal," while India's IT ministry said the platform had failed to prevent the circulation of obscene content.
Experts had previously cautioned xAI about the potential for abuse. Tyler Johnston of The Midas Project told Reuters, “In August, we warned that xAI’s image generation was essentially a nudification tool waiting to be weaponized. That’s basically what’s played out.”
