In a move that feels ripped from a sci-fi novel, NVIDIA is throwing its weight behind Starcloud, a startup with the audacious goal of putting data centers into orbit. The Redmond, Washington-based startup, a graduate of NVIDIA's Inception program, claims this celestial solution will offer energy costs up to 10 times lower than those of terrestrial facilities. The plan starts with a fridge-sized satellite, Starcloud-1, launching in November and carrying the cosmic debut of an NVIDIA H100 GPU, a chip more accustomed to air-conditioned server rooms than to the vacuum of space.
Starcloud’s pitch hinges on two fundamental advantages of space: effectively unlimited solar power and a literally cosmic-sized heat sink. Operating in orbit, the data centers would have near-constant access to solar energy, eliminating the need for grid power or backup batteries. More critically, they would radiate waste heat into the near-absolute-zero background of deep space; with no air for convection, large radiator panels are the only way to shed heat, but they cool passively, without the millions of tons of water consumed by Earth-based facilities. It’s an elegant solution, provided you can overlook the monumental cost and complexity of launching and maintaining high-performance electronics outside the Earth’s atmosphere.
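To get a feel for what "radiating heat away" implies at data-center scale, here is a back-of-envelope sketch using the Stefan-Boltzmann law. The radiator temperature, emissivity, and function names are illustrative assumptions for this estimate, not figures from Starcloud.

```python
# Rough radiator sizing for an orbital data center via the
# Stefan-Boltzmann law: P = eps * sigma * A * (T_rad^4 - T_sky^4).
# All parameter values are illustrative assumptions, not Starcloud numbers.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(power_w, t_radiator_k=300.0, t_sky_k=3.0, emissivity=0.9):
    """Single-sided panel area needed to radiate `power_w` of waste heat."""
    flux = emissivity * SIGMA * (t_radiator_k**4 - t_sky_k**4)  # W per m^2
    return power_w / flux

# Example: rejecting 5 GW of waste heat with radiators running near room temperature.
area = radiator_area_m2(5e9)
print(f"{area / 1e6:.1f} km^2 of radiator area")  # on the order of 12 km^2
```

The kilometers-square answer illustrates why the long-term designs discussed below call for panels measured in kilometers, not meters.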

The long-term vision is even more ambitious, with plans for a 5-gigawatt orbital data center whose solar and cooling panels would span roughly 4 kilometers on each side. While the initial launch is only a demonstrator, Starcloud’s CEO, Philip Johnston, boldly predicts that “in 10 years, nearly all new data centers will be built in outer space.” This vision is fueled by plummeting launch costs and the insatiable energy demands of AI, which are projected to cause global data center electricity consumption to more than double by 2030.

Why is this important?
The explosive growth of AI is creating an energy consumption crisis. Terrestrial data centers already account for 1-1.5% of global electricity use, a figure set to skyrocket. Starcloud’s plan, while astronomically ambitious, represents a serious attempt to solve a planet-sized problem. By moving the energy-intensive core of AI infrastructure off-world, it could theoretically decouple the growth of AI from Earth’s energy and water constraints. It’s a high-stakes gamble on whether the economics of space launch can mature faster than the environmental cost of computation on Earth.