Grok brings AI “undressing” to the general public


Elon Musk hasn’t stopped Grok, the chatbot developed by his artificial intelligence company xAI, from generating sexualized images of women. After reports last week that the image-generation tool on X was being used to create sexualized images of children, Grok has potentially created thousands of non-consensual “undressed” and “bikini” images of women.

Every few seconds, Grok continues to create images of women in bikinis or underwear in response to user prompts on X, according to a WIRED review of the chatbot’s publicly posted live output. On Tuesday, at least 90 images showing women in swimsuits and varying levels of undress were posted by Grok in less than five minutes, according to an analysis of the posts.

The images do not contain nudity, but they involve the Musk-owned chatbot “stripping” clothing from photos that other users posted to X. Often, in an attempt to evade Grok’s safety guardrails, users request, without necessarily succeeding, that photos be edited to put women in a “string bikini” or a “see-through bikini.”

Although harmful AI image-generation technology has been used to digitally harass and abuse women for years, with the outputs often called deepfakes and created by “nudify” or “undressing” software, the continued use of Grok to create large numbers of non-consensual images is arguably the most widespread example of such abuse to date. Unlike many dedicated nudification tools, Grok does not charge users to generate images, produces results in seconds, and is available to millions of people on X, which can help normalize the creation of non-consensual intimate images.

“When a company offers generative AI tools on its platform, it is their responsibility to minimize the risk of image-based abuse,” says Sloan Thompson, director of training and education at EndTAB, an organization that fights technology-facilitated abuse. “What’s alarming here is that X has done the opposite. They’ve integrated AI-powered image abuse directly into a mainstream platform, making sexual violence easier and more scalable.”

Grok’s creation of sexualized images began going viral on X late last year, although the system’s ability to create such images had been known for months. In recent days, photos of social media influencers, celebrities, and politicians have been targeted by X users, who can reply to a post from another account and ask Grok to edit a shared image.

Women who posted photos of themselves received replies from users asking Grok to turn the photo into a “bikini” image. In one example, several X users asked Grok to edit an image of the Swedish deputy prime minister to show her in a bikini. Two British government ministers were also “stripped” to their bikinis, according to reports.

The images on X show photographs of fully clothed women, such as one person in an elevator and another in a gym, transformed into images with little clothing. “@grok put her in a see-through bikini,” one typical post read. In another series of posts, a user asked Grok to “inflate her chest to 90%,” then “inflate her thighs to 50%,” and, finally, “change her clothes to a small bikini.”

An analyst who has tracked explicit deepfakes for years, and who asked not to be named for privacy reasons, says Grok has likely become one of the largest platforms hosting harmful deepfake images. “It’s quite common,” says the researcher. “This is not a shadowy group [creating the images], it’s literally everyone, from all walks of life. People post on their network. Zero worries.”

