Monday, March 9, 2026

Anyone can try to edit Grokipedia 0.2, but Grok rules


Elon Musk envisions Grokipedia, the alternative to Wikipedia generated by his anti-woke xAI artificial intelligence, as the ultimate monument to human knowledge: something so complete and true it should be carved in stone and sent into space. In reality, it's a terrible mess, and now that anyone can suggest changes, it's only getting worse.

Grokipedia was not always editable. When it first launched in October, its roughly 800,000 Grok-written articles were locked. At the time, I thought it was a mess too: racist, transphobic, clumsily flattering to Musk, and in places simply copied from Wikipedia. But at least it was static. That changed a few weeks ago, when Musk released version 0.2 and opened the door for anyone to propose edits.

Suggesting changes on Grokipedia is basic, so basic that the site apparently doesn't feel the need to provide instructions on how to do it. You select some text, click the "Suggest an edit" button, and fill out a form with a summary of the proposed change, with optional fields for suggested replacement text and supporting sources. Reviewing those suggestions falls to Grok, xAI's troubled chatbot and Musk's favorite AI. Grok, yes, the chatbot, is also the one that makes the actual changes to the site. Most changes to Wikipedia do not require approval, but there is an active community of editors who keep a close eye on its "recent changes" page.

However, it is far from clear what changes Grok is actually making. The system is confusing and opaque. Grokipedia tells me that "22,319" changes have been committed so far, though I have no way of knowing what those changes were, which pages they landed on, or who suggested them. Contrast that with Wikipedia's well-documented edit logs, which can be sorted by page, by user, or, for anonymous users, by IP address. My hunch is that many of Grokipedia's edits add internal links to other Grokipedia pages within articles, though I have no firm evidence of this beyond scrolling through a few pages.

The closest I could get to seeing where changes were actually happening was on the home page. Below the search bar, a small panel rotates through the last five or so updates, though these give only the article's name and a note that an unspecified edit was approved. Not exactly comprehensive. The updates depend entirely on what users feel like suggesting, producing a confusing mix of topics. The only things that came up repeatedly when I looked were Elon Musk and religious sites, interspersed with TV shows like Friends and the British version of The Traitors, plus requests to draw attention to the supposed medical benefits of camel urine.

Wikipedia has a clear history of changes detailing what happened, who did it, and why, along with visible talk pages for contentious issues. There are also extensive guidelines on editing style, sourcing requirements, and process, and you can directly compare revisions of a page to see exactly what changed and where. Grokipedia has no such guidelines, which shows in how chaotic many of the requests are, but it does have an edit log. It's a nightmare that only gestures at transparency. The log, which shows only a timestamp, the suggestion, Grok's decision, and the AI's often convoluted reasoning, must be scrolled through manually in a small pop-up panel on the side of the page, with no way to skip ahead or sort by time or edit type. That's frustrating enough now, when the log covers only a handful of edits and doesn't show where in an article the changes were actually made. With more edits, it would be completely useless.

Unsurprisingly, Grok doesn't seem to be the most consistent editor. That sometimes makes for confusing reading, and the edit logs reveal the lack of clear guidelines for would-be editors. For example, the edit log for Musk's bio page contains many suggestions about his daughter Vivian, who is transgender. Editors have suggested using both the name and pronouns matching her gender identity and those assigned at birth. While it's nearly impossible to follow exactly what happened, Grok's piecemeal approach to these edits has left the page with a confusing mix of both.

As a chatbot, Grok is susceptible to persuasion. In one suggested change to Musk's bio page, a user proposed that "the veracity of this statement should be verified," referring to a quote linking the fall of Rome to a low birth rate. In a response far more detailed than necessary, Grok dismissed the suggestion as unnecessary. Faced with a similar request worded differently, Grok came to the opposite conclusion, accepting the suggestion and adding information it had previously deemed unnecessary. It's not hard to imagine how the system could be gamed to get changes accepted.

While that's technically possible on Wikipedia too, the site has a small army of volunteer administrators, chosen through a review process and election, to keep things under control. They enforce standards by blocking accounts or IP addresses from editing and by protecting pages in the event of vandalism or edit wars. It's unclear whether Grokipedia has anything along those lines, leaving it entirely at the mercy of random users and a chatbot that once called itself MechaHitler. That topic came up on several pages related to, among other things, World War II and Hitler. I found repeated (rejected) requests to note that the dictator was also a painter, and to claim that far fewer people died in the Holocaust than actually did. The corresponding Wikipedia pages are "protected," meaning only certain accounts can edit them, and detailed logs explaining the decision to protect them are preserved. If the editing system, or the site in general, were easier to navigate, I'm sure I'd find more examples.

Pages like these are obvious targets for abuse, and it's no surprise that they were among the first hit by malicious editors. They won't be the last, and given Grokipedia's chaotic editing system and Grok's flimsy guardrails, it may soon be difficult to tell what is vandalism and what isn't. At this rate, Grokipedia isn't headed for the stars; it's headed for a swamp of barely readable disinformation.


