xAI just released Grok 3, its latest AI model. Access begins today for premium X subscribers in the U.S. and, via a separate subscription, for Grok’s web and app versions. The model was trained on xAI’s Colossus supercomputer in Memphis, recently doubled from 100,000 to 200,000 Nvidia GPUs.
Grok 3 was tested on standardized benchmarks in mathematics, science, and coding, with xAI reporting higher scores than OpenAI’s o1, DeepSeek V3, Google’s Gemini, and Anthropic’s Claude models. (These results await third-party verification.) Training incorporated synthetic data, which xAI says helps the model check its outputs for logical consistency. A new feature, DeepSearch, integrates with Grok 3 as a search tool focused on contextual accuracy. During a demonstration streamed on X, xAI founder Elon Musk noted that the model remains in beta, with daily updates planned and a voice assistant to follow.
Competition in AI development is fierce. OpenAI, which Musk co-founded in 2015 before departing, released o1 in September 2024, emphasizing reasoning tasks. DeepSeek, a Chinese firm, released an open-source model that reportedly matches o1’s performance while using far less computational power, despite U.S. restrictions on Nvidia GPU exports to China. xAI’s GPU cluster expansion reflects the sheer scale of compute required to train Grok 3.
xAI says Grok 3 offers practical utility. Its benchmark performance suggests reliability for data analysis tasks such as supply chain logistics or market trend forecasting. According to Musk, the Colossus infrastructure, among the largest GPU clusters globally, supports the model’s capacity to handle large datasets. The company says integration with X will provide access to real-time data streams relevant to business intelligence applications.
We are currently testing the model and will report what we find. Some say there is no moat for foundational model builders. If there is one, I think it’s filled to the brim with money. Nvidia H100 GPUs run $30,000 to $40,000 each, so 200,000 of them would cost $6 to $8 billion at retail. Even with Elon’s discount—who pays retail when you’re colonizing Mars?—it’s still a $3-to-$5 billion investment in brute-force compute. It may not be a deep moat, but it’s the kind of shallow puddle that drowns startups while Musk does cannonballs off the high dive.
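That $3-to-$5 billion figure is easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming 200,000 GPUs at the quoted $30,000–$40,000 list price and a purely hypothetical 40–50% bulk discount (the discount is my assumption, not a reported number):

```python
# Back-of-envelope check on the compute bill. Every figure here is an
# illustrative assumption, not a reported one: 200,000 H100-class GPUs
# at the $30,000-$40,000 list price quoted above.
NUM_GPUS = 200_000
LIST_PRICE_LOW = 30_000   # USD per GPU, low end of the quoted range
LIST_PRICE_HIGH = 40_000  # USD per GPU, high end of the quoted range

retail_low = NUM_GPUS * LIST_PRICE_LOW    # $6.0 billion at list price
retail_high = NUM_GPUS * LIST_PRICE_HIGH  # $8.0 billion at list price

# A hypothetical 40-50% discount off list lands squarely in $3-5B.
discounted_min = retail_low * 50 // 100   # 50% off the low end: $3.0B
discounted_max = retail_high * 60 // 100  # 40% off the high end: $4.8B

print(f"retail: ${retail_low / 1e9:.1f}B to ${retail_high / 1e9:.1f}B")
print(f"discounted: ${discounted_min / 1e9:.1f}B to ${discounted_max / 1e9:.1f}B")
```

Even the most generous discount assumptions keep the entry price in the billions, which is the whole point.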
Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.