Big ideas for big compute: Cisco looks beyond Earth
Chuck Robbins, CEO of Cisco, used a recent interview to raise a provocative question: if the rapid growth of AI will require vast new data centers, should some of that capacity live off-planet? As a longtime supplier of the routers, switches, and software that make the internet and cloud possible, Cisco has both a practical and a strategic interest in where data centers are built.
The impetus is simple and immediate: large data centers are noisy, power-hungry neighbors, and new builds have met growing local resistance. Moving some compute to space, an idea other industry leaders have also floated, could lessen terrestrial environmental strain, free up land and power resources, and create fresh ways to scale AI infrastructure without worsening community impacts.
Still, the framing has clear upsides: thinking about space-based or highly distributed compute recasts the infrastructure challenge as an opportunity for cleaner, more flexible deployment models. It also highlights Cisco's role in enabling ambitious solutions, from fiber and edge networks to the data-center interconnects that AI developers depend on.
Of course, the proposal is exploratory: technical, economic, and regulatory hurdles remain. But the broader takeaway is positive: major infrastructure players are actively innovating to balance AI growth with sustainability and community needs. Below are some potential upsides Cisco's public thinking helps illuminate:
- Reduced local environmental footprint and land use by offloading some capacity.
- New market opportunities for companies that build space-hardened networking and cooling systems.
- Incentives for cross-industry collaboration on standards, latency mitigation, and regulation for extraterrestrial infrastructure.