Orbital AI: Tech titans talk up the value of space-based AI

In a mind-bending fireside chat that's already sparking interstellar buzz, Elon Musk and NVIDIA CEO Jensen Huang dropped a bombshell at the 2025 US-Saudi Investment Forum: The future of AI isn't buried in desert or tundra data centres — it's orbiting above us.
With over 2.7 million views in hours, Musk's X post sharing the clip has tech titans and space nerds alike rethinking everything from energy crises to the Kardashev scale.
This isn't sci-fi; it's the blueprint for AI's next frontier.
Space is where unlimited solar power and zero-gravity cooling make Earth look like a quaint starter kit.
Picture this: Musk leans into the mic, Huang nodding along, as the pair describe AI supercomputers launched via Starship — tiny, featherweight racks sipping endless sunlight 24/7.
And no batteries required.
"Long before you exhaust potential energy sources on Earth," Musk declares, "the lowest cost way to do AI compute will be with solar-powered AI satellites."
Huang chimes in on the hardware hurdles, revealing how today's 2-tonne GB300 racks are bloated by cooling tech that becomes obsolete in the vacuum of space.
Radiative cooling? Check.
Panels half the weight without Earth's pesky glass and frames?
Double check.
The result? Terawatt-scale AI that's not just feasible but cheaper than terrestrial setups.
And, potentially, this could happen within 4-5 years.
This duo's synergy is electric: Musk's SpaceX muscle meets Huang's GPU empire, tackling AI's voracious appetite head-on.
The US averages about 460 GW of electricity demand for everything — Musk notes 300 GW just for AI would hog roughly two-thirds of that, while a terawatt?
"Impossible" without blackouts and endless power plant sprawl.
Space flips the script: Harvest a sliver of the sun's output (Earth intercepts only about one two-billionth of it), scale infinitely, and sidestep the planet-heating nightmare of ground-based mega-farms.
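The numbers quoted above can be sanity-checked with a quick back-of-the-envelope calculation. This sketch uses standard reference values (solar luminosity ~3.8e26 W, solar constant ~1361 W/m², Earth's radius ~6,371 km) — none of these figures come from the talk itself:

```python
import math

# Assumed physical constants (not from the talk itself):
SOLAR_LUMINOSITY_W = 3.8e26    # total power output of the Sun, watts
SOLAR_CONSTANT_W_M2 = 1361     # sunlight flux at Earth's distance, W/m^2
EARTH_RADIUS_M = 6.371e6       # mean Earth radius, metres
US_AVG_DEMAND_GW = 460         # US average electricity demand per the talk

# Earth intercepts sunlight over its cross-sectional disc, pi * r^2.
earth_intercept_w = SOLAR_CONSTANT_W_M2 * math.pi * EARTH_RADIUS_M**2
fraction_of_sun = earth_intercept_w / SOLAR_LUMINOSITY_W
print(f"Earth intercepts ~{fraction_of_sun:.2e} of the Sun's output")

# 300 GW of AI compute as a share of US average demand.
ai_share = 300 / US_AVG_DEMAND_GW
print(f"300 GW is about {ai_share:.0%} of US average demand")
```

The fraction works out to roughly one part in two billion, consistent with Musk's "1 or 2 billionth" figure, and 300 GW against 460 GW is indeed about two-thirds.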
The implications?
Game-changing for climate hawks (bye-bye, fossil fuel guzzlers), investors (hello, orbital data centre boom), and dreamers (Starlink meets AI overlords).

As replies flood in — from Litecoin fans pitching space miners to memes of Musk as a galactic landlord — this post isn't just viral; it's a launchpad. Civilisation's upward arc demands we don't sleep on it. Who's ready to compute among the stars?
Here's an extract from the X post, with a video transcript and key quotes.
Elon Musk's X post contains no written caption beyond a video link, but the 4:39 clip captures a dynamic exchange.
Here's the full transcribed dialogue for context:
Moderator: "AI in space, is that possible?"
Elon Musk: "Yes. If civilisation continues — which it probably will — then AI in space is inevitable. I always have to preface: We shouldn't take civilisation for granted. We need to ensure that civilisation has an upward arc. Any student of history knows that civilisations have life cycles. Hopefully we are in a strong upward arc — I think we are for now — but we don't want to take that for granted or be complacent."
Musk (on energy scale): "The way to think of AI is that any meaningful percentage — where you're using even a millionth of the sun's energy — you must [go to space]. So, once you think in terms of a Kardashev 2-scale civilisation, which is: what percentage of the sun's energy are we turning to useful work?
"Then it becomes obvious that space overwhelmingly matters. The Earth only receives roughly 1 or 2 billionth of the sun's energy. So if you want to have something that is say a million times more energy than Earth could possibly produce, you must go into space. This is where it's kind of handy to have the Space Company, I guess."
Jensen Huang: "Easier to cool chips in space too."
Musk: "Yes, easier to cool chips in space. Yeah, there's definitely no water in space. We're gonna do something that doesn't involve water — just hang out. Well, it's just gonna radiate."
Musk (on timeline and costs): "My estimate is that the cost-effectiveness of AI in space will be overwhelmingly better than AI on ground. Long before you exhaust potential energy sources on Earth — meaning like I think even perhaps in a 4 or 5 year time frame — the lowest cost way to do AI compute will be with solar-powered AI satellites. So I'd say not more than 5 years from now."
Huang (on hardware): "And just look at the supercomputers we're building together. Let's say each one of these racks is 2 tons. Out of that 2 tons, 1.95 is probably for cooling. Right? Just imagine how tiny that little supercomputer — each one of these GB300 racks — will be just a tiny thing."
Musk (on scaling): "And electricity generation is already becoming a challenge. So if you start doing any kind of scaling for both electricity generation and cooling, you realise: Okay, space is incredibly compelling. So like, let's say you wanted to do 2 or 300 gigawatts per year of AI compute... Yeah, it's very difficult to do that on Earth.
"The US average electricity usage, last time I checked, was around 460 gigawatts per year average usage. So something like say 300 gigawatts a year — that would be like two-thirds of US electricity production for your [AI]. There's no way. Building power plants at that level? And if you take it up to say a terawatt per year? Impossible. Yeah, like you have to do that in space. There just is no way to do a terawatt per year on Earth."
Musk (on space advantages): "In space, you've got continuous solar. You actually don't need batteries because it's always sunny...The solar panels actually become cheaper because you don't need glass or framing. Um, and the cooling is just radiative. So that's why I think... the dream."
Huang: "Yes, that's the dream."