X is kicking off the new year in fight mode, with the platform now facing potential restrictions in several regions over its Grok AI chatbot, and its capacity to be used as a CSAM tool.
As we reported last week, X is currently under scrutiny in several nations due to Grok producing sexualized images of anyone, young or old, at the request of X users.
Which has become something of a trend, with one report indicating that at one stage in the new year, Grok was producing over 6,700 images every hour that would be categorized as “sexually suggestive or nudifying.”
X has responded by limiting access to Grok’s image generation features to paying users only. Though at the same time, X owner Elon Musk has also gone on the offensive, saying that people can generate non-consensual nudes and other offensive images in various other AI and image generation apps, and the only reason why X is being targeted is because he and X are leading the way on free speech, and as a result, many governments want to shut it down.
Which is a bizarre argument, but his many acolytes have been voicing their support, as X and xAI refuse to update the Grok app to limit potential misuse on this basis.
And it looks set to become a much bigger issue this week.
Over the weekend, reports suggested that the Australian, U.K. and Canadian governments were considering a group ban of X, with the hopes that it would help to reinforce the message that facilitating such images is not acceptable (Canada has since said that it is not considering a ban of X at this stage).
Meanwhile, both Indonesia and Malaysia have blocked access to the separate Grok app, with an Indonesian government spokesman stating that:
“The government views the practice of non-consensual sexual deepfakes as a serious violation of human rights, dignity, and the security of citizens in the digital space.”
So X is already facing restrictions, which could expand this week. Which will eventually bring in the U.S. government, and potential foreign trade penalties, as Elon looks to utilize his government connections to maintain X as a free and open platform, however he chooses.
Which is pretty strange, because really, what Elon and his friends are advocating for here is for people to be able to generate images like this:

Why? Why on earth does anybody need this functionality?
xAI could simply block nudification and related commands in Grok, stopping it from producing such images, which would resolve the issue straight away. But for some reason, Elon Musk is so dedicated to keeping this specific option available that he’s willing to risk massive impacts on the business.
It just seems like such an odd stance, when the solution is clear, but as with many things around Elon Musk, he’s now turning this into a fight for free speech, and part of a broader culture war, which could see X lose millions of users as a result.
And again, the arguments against making a change to Grok are pretty weak.
For example, the counterargument that you can generate similar images in other apps is somewhat valid, but those other apps don’t have the scale of X (600 million users), and are not being used to generate fake, sexualized images at a rate of 6,000+ per hour.
X’s market presence means it will come under more scrutiny in this respect, while other nudify apps are also under investigation on the same grounds.
Some X users have also argued that governments should be going after porn apps instead, though those are also subject to more stringent regulations, and risk penalties as a result of any violations. Yes, people can get around the existing safeguards, but regulatory groups are investigating them in the same way, while porn apps are banned from the Google Play and Apple app stores.
So if your argument is that porn apps get a pass while X is being unfairly singled out, that makes no sense, because porn apps are already excluded from those stores.
Another target amongst X folk has been Snapchat, which they argue poses a much bigger risk to youngsters than X. But Snapchat is primarily a private messaging platform, which makes it harder to investigate, and it’s not enabling users to generate deepfake nudes that are then publicly accessible to millions of people.
As noted, the arguments against governments taking action against X are pretty weak, especially when the simplest solution would be to stop Grok from generating non-consensual nudes instead. Because why does anybody need that function?
Either way, it seems like this will become a bigger point of contention this week, as governments weigh their options for tackling X, and face off against Elon and his White House connections.