xAI just released Grok 3, its latest AI model. Access begins today for premium subscribers on the X platform in the U.S., with a separate subscription available for Grok’s standalone web and app versions. The model runs on xAI’s Colossus supercomputer in Memphis, which has been upgraded from 100,000 to 200,000 Nvidia GPUs to handle its training workload.
xAI tested Grok 3 on standardized benchmarks in mathematics, science, and coding, and reports higher scores than OpenAI’s o1, DeepSeek V3, Google’s Gemini, and Anthropic’s Claude models. (Those results are still being verified by third parties.) Training included synthetic data, which xAI says helps the model check its outputs for logical consistency. A new feature, “Deep Search,” pairs Grok 3 with a search engine focused on contextual accuracy. During a demonstration streamed on X, xAI founder Elon Musk noted that the model is still in beta, with daily updates planned and a voice assistant to follow.
Competition in AI development is fierce. OpenAI, which Musk co-founded in 2015 before departing, released o1 in September 2024 with an emphasis on reasoning tasks. DeepSeek, a Chinese firm, released an open-source model that reportedly matches o1’s performance, supposedly achieved with far less computational power despite U.S. restrictions on Nvidia GPU exports to China. xAI is betting on scale instead, doubling its GPU cluster to meet Grok 3’s training demands.
xAI says Grok 3 offers practical utility. Its benchmark performance suggests it is reliable for data analysis tasks such as supply chain logistics or market trend forecasting. According to Musk, the Colossus infrastructure, among the largest GPU clusters in the world, gives the model the capacity to handle large datasets. The company says integration with X will provide access to real-time data streams relevant to business intelligence applications.
We are currently testing the model and will report what we find. There are some who say that there is no moat for foundational model builders. If there is one, I think it’s filled to the brim with money. Nvidia H100 GPUs cost between $30,000 and $40,000 each. Even with Elon’s discount—who pays retail when you’re colonizing Mars?—it’s still a $3-to-$5 billion investment in brute-force compute. It may not be a deep moat, but it’s the kind of shallow puddle that drowns startups while Musk does cannonballs off the high dive.
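For those who want the back-of-envelope math, here’s a rough sketch. The per-GPU prices are the widely reported retail range, and the 50% discount is purely an assumption for illustration, not a number anyone has confirmed:

```python
# Back-of-envelope estimate of xAI's compute spend (a rough sketch).
# Assumptions, not confirmed figures: 200,000 H100-class GPUs at the
# widely reported $30,000-$40,000 retail price, and a hypothetical
# 50% volume discount standing in for "Elon's discount."

GPU_COUNT = 200_000
RETAIL_LOW, RETAIL_HIGH = 30_000, 40_000  # USD per GPU (reported retail range)
ASSUMED_DISCOUNT = 0.50                   # hypothetical bulk discount

retail_low_total = GPU_COUNT * RETAIL_LOW                     # $6.0B
retail_high_total = GPU_COUNT * RETAIL_HIGH                   # $8.0B
discounted_low = retail_low_total * (1 - ASSUMED_DISCOUNT)    # $3.0B
discounted_high = retail_high_total * (1 - ASSUMED_DISCOUNT)  # $4.0B

print(f"List price:  ${retail_low_total / 1e9:.1f}B - ${retail_high_total / 1e9:.1f}B")
print(f"Discounted:  ${discounted_low / 1e9:.1f}B - ${discounted_high / 1e9:.1f}B")
```

At list price the hardware alone lands in the $6-to-$8 billion range; the assumed discount brings it into the neighborhood of the figure above.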
As always, your thoughts and comments are both welcome and encouraged. Just reply to this email. -s
P.S. If you're wondering whether Grok 3 will impact how you choose the right AI models for your applications, come join us at the MMA CMO AI Transformation Summit (March 18, 2025 | NYC). I'm facilitating and co-producing this half-day, invitation-only event, which will provide insights into the strategies, technologies, and leadership practices for CMOs who are driving successful AI transformations across the world’s best marketing organizations. Request your invitation.
Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media, and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business for Good Day New York, is a regular commentator on CNN, and writes a popular daily business blog. He's a bestselling author and the creator of the popular, free online course Generative AI for Execs. Follow @shellypalmer or visit shellypalmer.com.