Elon Musk has outlined his plans for the future of his xAI project, including a new open source roadmap for the company's Grok models.
Over the weekend, Musk reiterated that his plan is for xAI to be running the equivalent of 50 million H100 chips within the next five years.
Nvidia's H100 chips have become the industry standard for AI projects, due to their superior processing capacity and performance. A single H100 chip can move data at around 900 gigabytes per second, making it ideal for large language models, which rely on processing billions, even trillions, of parameters quickly in order to provide fast, accurate AI responses.
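To give a rough sense of what that figure means in practice, here's a minimal back-of-envelope sketch (in Python, purely illustrative), assuming FP16 weights at two bytes per parameter and treating 900 GB/s as the effective transfer rate; real-world throughput depends on memory bandwidth, batching, and interconnect topology.

```python
# Rough back-of-envelope: how long would it take to stream a model's
# weights once at 900 GB/s? Illustrative only; assumes FP16 weights
# (2 bytes per parameter) and ignores caching, batching, and overheads.

BYTES_PER_PARAM = 2          # FP16 precision (assumption)
TRANSFER_RATE_GBPS = 900     # gigabytes per second (figure cited above)

def seconds_to_stream(num_params: float) -> float:
    """Time to move all weights once at the assumed transfer rate."""
    total_gb = num_params * BYTES_PER_PARAM / 1e9
    return total_gb / TRANSFER_RATE_GBPS

for name, params in [("70B-parameter model", 70e9), ("1T-parameter model", 1e12)]:
    print(f"{name}: ~{seconds_to_stream(params):.2f}s per full pass of the weights")
```

Even at that rate, a trillion-parameter model takes a couple of seconds per full pass of its weights, which is why labs stack these chips by the hundreds of thousands.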
No other chips have been found to perform as well for this task, which is why H100s are in high demand, higher than Nvidia itself can keep up with. All of the major AI projects are looking to stockpile as many as they can, but at a price of around $30,000+ per unit, it's a significant outlay for large-scale projects.
At present, xAI is running around 200k H100s within its Colossus data center in Memphis, while it's also working on Colossus II to expand on that capacity. OpenAI is also reportedly running around 200k H100s, while Meta is on track to blow them out of the water with its latest AI investments, which will take it up to around 600k H100 capacity.
Meta's systems aren't fully operational as yet, but in terms of compute, it does seem, at this stage, like Meta is leading the charge, while it's also planning to invest hundreds of billions more into its AI projects.
Though there's another consideration within this, in the custom chips that each company is working on to reduce reliance on Nvidia and give themselves an advantage. All three, as well as Google, are developing their own hardware to power up their AI models, and since these are all internal projects, there's no knowing exactly how much they'll boost capacity and enable greater efficiency on the back of their existing H100s.
So we don't know which company is going to win out in the AI compute race in the longer term, but Musk has clearly stated his intention to eventually outmanoeuvre his rivals with xAI's strategy.
Maybe that'll work, and certainly, xAI already has a reputation for expanding its projects at speed. But that could also lead to setbacks, and on balance, I'd assume that Meta is likely best positioned to end up with the most capacity on this front.
Of course, capacity doesn't necessarily define the winner here, as each project needs to come up with valuable, innovative use cases, and establish partnerships to expand usage. So there's more to it, but from a pure capacity standpoint, this is how Elon hopes to power up his AI project.
We'll see if xAI is able to secure enough funding to keep on this path.
On another front, Elon has also outlined xAI's plans to open source all of xAI's models, with Grok 2.5 being released on Hugging Face over the weekend.
That means that anybody can view the weights and processes powering xAI's models, with Elon vowing to release every Grok model in this way.
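For those who want to take a look themselves, here's a minimal sketch of pulling the published weights with Hugging Face's huggingface_hub library. The repo ID below is an assumption based on xAI's earlier Grok-1 release (xai-org/grok-1), so check xAI's actual Hugging Face listing before running, and be aware the checkpoint runs to hundreds of gigabytes.

```python
# Minimal sketch: download the open-sourced Grok weights from Hugging Face.
# NOTE: the repo ID is an assumption -- confirm the actual listing on
# xAI's Hugging Face page first. The full checkpoint is very large,
# so point local_dir at a disk with plenty of free space.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="xai-org/grok-2",    # assumed repo name; xAI's Grok-1 lives at xai-org/grok-1
    local_dir="./grok-weights",  # where the checkpoint files will be stored
)
print(f"Weights downloaded to: {local_path}")
```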
The idea, then, is to enhance transparency, and ensure trust in xAI's process. Which is interesting, considering that Grok 4, in particular, checks in on what Elon thinks about certain issues, and weighs that into its responses:
I wonder if that will still be a part of its weighting when Grok 4 gets an open source release.
As such, I question whether this is more PR than transparency, with a sanitized version of Grok's models being released.
Either way, it's another step for Elon's AI push, which has now become a key focus for his X Corp empire.