In the fascinating new reality of the internet, teen girls can't learn about periods on Reddit, and indie artists can't sell adult games on Itch.io, but a military contractor will make you nonconsensual Taylor Swift deepfakes for $30 a month.
Early Tuesday, Elon Musk's xAI launched a new image and video generator called Grok Imagine with a "spicy" mode, whose output ranges from suggestive gesturing to nudity. Because Grok Imagine also has no perceptible guardrails against depicting real people, you can essentially generate softcore porn of anyone famous enough for Grok to recreate (although, in practice, it seems to mostly produce its most explicit NSFW output for women). Musk boasted that users generated over 34 million images within a day of launch. But the real feat is demonstrating that xAI can ignore the pressure other services face to keep adult content out, while helping users create something widely condemned, thanks to legal loopholes and political leverage that no other company has.
The xAI video feature, which debuted around the same time as a romantic companion chatbot named Valentine, is from one angle strikingly odd, because it arrives at a moment when sexual content of all kinds is being pushed off the internet. Late last month, the United Kingdom began enforcing age-gating rules that require X and other services to block sexual or otherwise "harmful" content for users under 18. Around the same time, an activist group called Collective Shout successfully pressured Steam and Itch.io to purge adult games and other media, with Itch.io deindexing all NSFW uploads.
Deepfake porn of real people is a form of nonconsensual intimate imagery that is illegal to knowingly publish in the US under the Take It Down Act, signed by President Donald Trump earlier this year. In a statement published Thursday, the Rape, Abuse & Incest National Network (RAINN) called the Grok feature "part of a growing problem of image-based sexual abuse" and quipped that Grok apparently "didn't get the memo" about the new law.
But according to Mary Anne Franks, a professor at George Washington University Law School and president of the nonprofit Cyber Civil Rights Initiative (CCRI), there is "little danger for Grok" under the Take It Down Act. "The criminal provision requires 'publication,' which, while regrettably not defined in the statute, suggests making content available to more than one person," Franks says. "If Grok only delivers the videos to the person using the tool, that wouldn't seem to qualify."
Regulators haven't enforced the rules against huge companies
Grok probably also isn't required to remove images under the Take It Down Act's takedown provision, even though that provision is so alarmingly broad that it threatens most social media services. "I don't think Grok, at least the specific Grok tool, qualifies as a 'covered platform,' because the definition requires that the platform 'primarily provides a forum for user-generated content,'" Franks says. "AI-generated content often involves user inputs, but the actual content is, as the name indicates, generated by AI." The takedown system is also designed to work through people flagging content, and Grok doesn't publicly post the images where other users can see them. It simply makes them extremely easy to create (and, almost inevitably, to post on social media) at scale.
Franks and CCRI flagged the narrow definition of "covered platform" as a problem for other reasons months ago. It is one of several ways the Take It Down Act fails to serve people affected by nonconsensual intimate images while creating risks for internet platforms acting in good faith. It might not even stop Grok from proactively publicizing lewd images of real people, Franks told Spitfire News in June, partly because there are open questions about whether Grok counts as a "person" under the law.
This kind of failure is a running theme in internet regulation that ostensibly combats harmful or inappropriate content; Britain's age-verification mandate, for instance, has made it difficult to operate independent forums while remaining fairly easy for kids to evade.
Compounding the problem, particularly in the US, regulatory agencies have imposed no meaningful consequences for all manner of rule-breaking by powerful companies, including many of Musk's. Trump has given Musk-owned companies nearly free rein for bad behavior, and even after formally leaving his powerful position in the administration, Musk likely retains enormous leverage over regulators like the FTC. (xAI just received a contract worth up to $200 million with the Department of Defense.) Even if xAI did violate the Take It Down Act, it probably wouldn't face an investigation.
Beyond the government, there are layers of gatekeepers that dictate what's permissible on platforms, and they often target sex. Apple, for instance, has pushed Discord, Reddit, Tumblr, and other platforms to censor NSFW material, with varying degrees of success. Steam and Itch.io reevaluated their adult content under threat of losing relationships with payment processors and banks, which have previously put the screws to platforms like OnlyFans and Pornhub.
In some cases, like Pornhub's, that pressure resulted from platforms enabling clearly harmful and illegal material. But Apple and the payment processors don't appear to maintain strict, evenly enforced rules. Their enforcement seems to depend heavily on how much sustained public pressure a target faces weighed against how much power it has, and despite his falling-out with Trump, virtually no one in business has more political power than Musk. Musk and Apple have repeatedly clashed over Apple's rules, and Apple has mostly prevailed on issues like its fee structure, but it has apparently backed down on smaller fights, like resuming advertising on X after pulling its ads from the platform over its monetization of Nazi content.
Apple has banned smaller apps that create fake nudes of real people. Will it exert that kind of pressure on Grok, whose video service just launched on iOS? Apple did not respond to a request for comment, but don't hold your breath.
The new Grok feature harms the people who can now easily have nonconsensual nudes made of them by a major AI service, but it also demonstrates how empty the promise of a "safer" internet is. Small platforms face pressure to remove consensually produced or entirely fictional media made by humans, while a company run by a billionaire can profit from something that, in some circumstances, is flatly illegal. If you're online in 2025, nothing is really about sex, including sex itself, which, as usual, is about power.
