To train Grok 3, xAI used a large data center in Memphis containing roughly 200,000 GPUs. Musk said in a post on X that Grok 3 was developed with “10x” the computing power of its predecessor, Grok 2, leveraging an expanded training data set that also includes filings from court cases.
During a live-streamed presentation on Monday, Musk stated that “Grok 3 is an order of magnitude more capable than Grok 2,” adding, “[It’s a] maximally truth-seeking AI, even if that truth is sometimes at odds with what is politically correct,” TechCrunch reported.
Grok 3 is part of a family of models that includes Grok 3 mini, which responds to questions more quickly, in some cases at the expense of accuracy. Not all Grok 3 models and features are available yet, but they are expected to roll out in the near future.
xAI also claims that Grok 3 beats GPT-4o on several benchmarks, including AIME (which evaluates model performance on math questions) and GPQA (which tests models on PhD-level physics, biology, and chemistry questions). An early version of Grok 3 also earned a competitive score on Chatbot Arena.
Two models in the latest release, Grok 3 Reasoning and Grok 3 mini Reasoning, can “think through” tasks, using a “reasoning” process similar to the one in OpenAI’s o3-mini and DeepSeek’s R1 model.
Musk also said that Grok 2 will be open-sourced in the near future. “Our general approach is that we will open-source the last version [of Grok] when the next version is fully out,” he said. “When Grok 3 is mature and stable, which is probably within a few months, then we’ll open-source Grok 2.”