Elon Musk Claims Earth Is the Wrong Place for AI, Says We Have 30 Months Left

Elon Musk made a striking prediction during a recent conversation with Dwarkesh Patel. He stated that within 30 to 36 months, space will become the most economically compelling location for artificial intelligence infrastructure.

His reasoning centers on one critical resource that Earth is rapidly running out of: electricity.

“In 36 months, but probably closer to 30 months, the most economically compelling place to put AI will be space,” Musk stated plainly. The assertion wasn’t based on speculative futurism but on practical engineering constraints he’s already encountering at companies like xAI and Tesla.

Musk’s central argument revolves around electrical capacity. Outside of China, global electricity output remains essentially flat, while chip production grows exponentially.

“The output of chips is growing pretty much exponentially, but the output of electricity is flat. So how are you going to turn the chips on?” he asked.

He pointed out that the United States currently uses only half a terawatt of power on average. Scaling AI infrastructure to even one terawatt, which he considers necessary for reaching the singularity, would require doubling America’s entire electrical consumption.
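The scale check here is straightforward. A minimal sketch using Musk's own figures (the variable names and the per-year framing are ours, not his):

```python
# Rough scale check using the figures Musk cites:
# US average electrical demand is about 0.5 TW, and he ties a 1 TW
# AI buildout to "reaching the singularity".
US_AVG_POWER_TW = 0.5   # Musk's figure for current US average draw
AI_TARGET_TW = 1.0      # his threshold for singularity-scale compute

# New AI load expressed as a multiple of today's entire grid output
multiple_of_current_grid = AI_TARGET_TW / US_AVG_POWER_TW
print(multiple_of_current_grid)  # 2.0 -> the grid would have to double
```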

“Can you imagine building that many data centers, that many power plants?” Musk challenged.

The xAI founder described the difficulty of bringing just one gigawatt online for the Colossus project in Memphis. The effort required coordinating multiple gas turbines, navigating permit issues across state lines, and building power infrastructure in Mississippi when Tennessee proved too difficult.

Even then, the actual power generation needed exceeded naive calculations by significant margins due to cooling, networking, and operational reserve requirements.

Space offers five fundamental advantages, according to Musk. Solar panels in space operate at approximately five times the effectiveness of ground-based installations.

“You don’t have a day-night cycle, seasonality, clouds, or an atmosphere in space,” he explained, noting that Earth’s atmosphere alone causes about 30 percent energy loss.

Beyond efficiency, space eliminates the need for batteries since sunlight is constant. There are no weather-related failures, no permitting battles, and no competition with other uses for the same electrical capacity. “It’s actually much cheaper to do in space,” Musk concluded.
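The "five times" figure is roughly what falls out of combining duty cycle and atmospheric loss. An illustrative sketch, where the ground duty cycle is an assumed round number (not a figure from the conversation) and only the 30 percent atmospheric loss comes from the article:

```python
# Illustrative sketch (not Musk's exact math) of why an orbital panel
# can be several times more productive than the same panel on the ground.
GROUND_DUTY_CYCLE = 0.25        # assumption: day-night cycle, seasons, clouds
ATMOSPHERIC_TRANSMISSION = 0.7  # ~30% loss, the figure quoted above

space_output = 1.0  # continuous sunlight, no atmosphere
ground_output = GROUND_DUTY_CYCLE * ATMOSPHERIC_TRANSMISSION

ratio = space_output / ground_output
print(round(ratio, 1))  # ~5.7, in the ballpark of the "five times" claim
```

A slightly higher assumed duty cycle lands the ratio closer to five; the point is the multiplier, not the decimal.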

The cost comparison becomes even more favorable when accounting for terrestrial obstacles. Solar tariffs in the United States run several hundred percent, domestic production capacity is minimal, and acquiring land with proper permits for utility-scale solar requires years.

“Try getting the permits for that. See what happens,” Musk said sarcastically about covering Nevada in solar panels.

Musk was blunt about hardware constraints. Gas turbine manufacturers are sold out through 2030, with the limiting factor being specialized vanes and blades for the turbines. Only three companies worldwide manufacture these components, and all face massive backlogs.

“The limiting factor is the vanes and blades. There are only three casting companies in the world that make these, and they’re massively backlogged,” he said, adding that even when you can source everything else, these critical components take 12 to 18 months longer to obtain.

For ground-based data centers, the power requirements extend far beyond the chips themselves. Cooling systems, networking hardware, storage systems, and operational reserves multiply the base load significantly. For every 110,000 Nvidia GB300 chips at xAI, roughly 300 megawatts of generation capacity is required. This is nearly triple the chips’ direct power consumption.
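The multiplier implied by those figures can be back-calculated. A sketch assuming roughly 1 kW of direct draw per chip (the article gives only the totals, so the per-chip figure is our assumption):

```python
# Back-of-envelope from the article's totals. The per-chip draw is an
# assumed round number; only the chip count and 300 MW come from the text.
NUM_CHIPS = 110_000
ASSUMED_CHIP_POWER_KW = 1.0   # assumption: ~1 kW per GB300 package
GENERATION_CAPACITY_MW = 300  # generation capacity cited for xAI

direct_load_mw = NUM_CHIPS * ASSUMED_CHIP_POWER_KW / 1_000
overhead_multiplier = GENERATION_CAPACITY_MW / direct_load_mw
print(direct_load_mw, round(overhead_multiplier, 2))  # 110.0 MW, ~2.73x
```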

When asked about the balance between terrestrial and space-based AI in five years, Musk made a prediction: “Five years from now, my prediction is we will launch and be operating every year more AI in space than the cumulative total on Earth.”

He elaborated that launching a few hundred gigawatts per year to space within that timeframe is feasible, potentially reaching a terawatt annually before fuel supply for rockets becomes constraining.

This would require approximately 10,000 Starship launches per year, roughly one every hour, with each carrying the solar arrays, radiators, and computing hardware needed for orbital data centers.
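The cadence arithmetic works out as follows:

```python
# Launch interval implied by 10,000 launches per year.
LAUNCHES_PER_YEAR = 10_000
HOURS_PER_YEAR = 365 * 24  # 8,760

minutes_between_launches = HOURS_PER_YEAR * 60 / LAUNCHES_PER_YEAR
print(round(minutes_between_launches))  # ~53 minutes: about one per hour
```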

Musk acknowledged that SpaceX is preparing for exactly this scale. “SpaceX is gearing up to do 10,000 launches a year, and maybe even 20 or 30,000 launches a year,” he said, adding that as few as 20 to 30 Starship vehicles could maintain this cadence given the orbital mechanics and ground track requirements.