Why are Grok and X still available in the App Stores?


Elon Musk’s AI chatbot Grok is being used to flood X with thousands of sexualized images of adults and apparent minors wearing minimal clothing. Some of this content not only appears to violate X’s own policies, which prohibit the sharing of illegal content such as child sexual abuse material (CSAM), but may also violate Apple App Store and Google Play Store guidelines.

Both Apple and Google explicitly ban apps containing CSAM, the hosting and distribution of which is illegal in many countries. Both companies also ban apps that contain pornographic material or facilitate harassment. Apple’s App Store guidelines say it does not allow “overtly sexual or pornographic material,” nor “defamatory, discriminatory or mean-spirited content,” especially if an app is “likely to humiliate, intimidate or harm a targeted individual or group.” The Google Play Store prohibits apps that “contain or promote content associated with sexually predatory behavior, or distribute non-consensual sexual content,” as well as apps that “contain or facilitate threats, harassment, or intimidation.”

Over the past two years, Apple and Google have removed a number of “nudify” and AI image generation apps after investigations by the BBC and 404 Media found they were advertised or used to turn ordinary photos into explicit images of women without their consent.

But at the time of publication, the X application and the standalone Grok app remain available in both app stores. Apple, Google and X did not respond to requests for comment. Grok is operated by Musk’s multibillion-dollar artificial intelligence startup, xAI, which also did not respond to WIRED’s questions. In a public statement published on January 3, X said it was taking action against illegal content on its platform, including CSAM. “Anyone who uses or encourages Grok to create illegal content will face the same consequences as if they downloaded illegal content,” the company warned.

Sloan Thompson, director of training and education at EndTAB, a group that teaches organizations how to prevent the distribution of nonconsensual sexual content, says it’s “absolutely appropriate” for companies like Apple and Google to take action against X and Grok.

The number of explicit, non-consensual images generated by Grok on X has exploded over the past two weeks. A researcher told Bloomberg that over a 24-hour period between January 5 and 6, Grok produced approximately 6,700 images per hour that they identified as “sexually suggestive or nudifying.” Another analyst collected more than 15,000 URLs of images created by Grok on X during a two-hour period on December 31. A review of about a third of those images found that many featured women in revealing clothing. More than 2,500 were marked as no longer available after a week, while almost 500 were labeled as “age-restricted adult content.”

Earlier this week, a spokesperson for the European Commission, the European Union’s executive arm, publicly condemned sexually explicit and non-consensual images generated by Grok on X as “illegal” and “appalling,” telling Reuters that such content “has no place in Europe.”

On Thursday, the EU ordered X to retain all internal documents and data relating to Grok until the end of 2026, extending a previous retention directive, so that authorities can access relevant records for compliance with the EU’s Digital Services Act; a new formal investigation has not yet been announced. Regulators in other countries, including the United Kingdom, India, and Malaysia, have also said they are investigating the social media platform.


