Musk's Grok AI generated fully pornographic videos, research shows
What's the story
Elon Musk's artificial intelligence (AI) tool, Grok, has been exploited to generate sexually explicit and violent video content featuring women, a recent study has revealed. The research, conducted by AI Forensics, a Paris-based non-profit organization, uncovered nearly 800 images and videos created with the Grok Imagine app that contained pornographic material. Paul Bouchaud of AI Forensics described them as "fully pornographic videos" of professional quality. Following a global outcry, X has restricted Grok's AI image generator to paid users.
Disturbing misuse
Grok's role in undressing image of woman killed by ICE agent
Grok was also misused to digitally undress an image of Renee Nicole Good, the woman killed by an Immigration and Customs Enforcement (ICE) agent in the US. The tool was even used to depict her with a bullet wound in her forehead. The incident underscores the disturbing potential of AI tools like Grok when they fall into the wrong hands.
Content analysis
Grok's content significantly more explicit than previous trends
AI Forensics was able to retrieve the pornographic images because users had created "sharing links," which allowed the material to be captured by the Wayback Machine, an internet archive. The study found that more than half of the images depicted people in "minimal attire," mostly women under 30. Bouchaud noted that this content is far more explicit than the "bikini" trend previously seen on X, the social media platform owned by Musk's tech company xAI.
Political response
UK Prime Minister condemns AI-generated explicit content
UK Prime Minister Keir Starmer has condemned the flood of AI-generated photos of partially clothed women and children on X, describing the content as "disgraceful" and "disgusting." Speaking to Greatest Hits Radio, Starmer hinted that X could be banned in the UK under the Online Safety Act if it fails to act against the explicit material. He said the regulator Ofcom "has our full support to take action in relation to this."
Advocacy response
Women's rights campaigners demand urgent action against AI misuse
Women's rights activists have criticized the UK government for its delayed response to the growing issue. Penny East, CEO of the Fawcett Society, the UK's leading women's rights charity, called for immediate government action, saying, "The increasingly violent and disturbing use of Grok illustrates the huge risks of AI without sufficient safeguards." East also stressed that stronger regulation is urgently needed to prevent such abuses of AI tools.