A few years ago, technology companies had a bold vision of building networks of artificial intelligence data centers — massive warehouses full of computers that store and process data for websites, businesses and governments. But the drawbacks soon became apparent: AI data centers consume huge amounts of energy, and their impact on power use, water consumption and carbon emissions has become a growing concern.
Silicon Valley has proposed a solution: put GPUs in space and build large AI data centers off Earth. Billionaires including Jeff Bezos, Elon Musk, and Sam Altman have floated plans to put data centers in orbit and even on the moon. Now Google has joined the effort, confirming this week that it is working on technology aimed at creating scalable networks of orbital TPUs.
What is Project Suncatcher and how does it work?
Google’s new project is called Project Suncatcher, a research initiative exploring how solar-powered satellite constellations could host data centers in space. The idea resembles constellations such as Starlink, which beam high-speed Internet from thousands of satellites orbiting the Earth.
The difference is that Google aims to deploy high-performance AI accelerators in orbit. The vision of scalable orbital data centers relies on solar-powered satellites, with free-space optical links connecting the nodes into a distributed network.

“The Sun is the ultimate energy source in our solar system, emitting more power than 100 trillion times humanity’s total electricity production. In the right orbit, a solar panel can be up to 8 times more productive than on Earth, and produce power nearly continuously,” Google said in its preliminary paper, titled “Towards a future space-based, highly scalable AI infrastructure system design.”
Similar to how a network of satellites in low Earth orbit broadcasts the Internet from space, Google believes that placing massive computers, especially TPUs, connected by free-space optical links capable of transferring data at tens of terabits per second, could enable space-based AI data centers.
What are the challenges?
But the immediate question is how to keep satellites connected at high speeds as they orbit the Earth. On the ground, nodes in a data center communicate over ultra-fast fiber-optic links.
Maintaining high-speed communications between orbiting servers will require wireless links capable of operating at tens of terabits per second. Early ground tests have demonstrated two-way speeds of up to 1.6 terabits per second, and Google believes this could increase over time.
The cost of electricity is a major long-term challenge for operating data centers on the ground, and rooftop solar panels would not come close to covering their needs. In space, however, Google points out that solar panels could be up to eight times more productive than they are on Earth. The combination of nearly uninterrupted sunlight and higher efficiency means far more power is available for data processing.
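To see where a figure like “eight times” could come from, here is a back-of-envelope sketch in Python. The capacity factor, sunlight fraction, and atmospheric gain below are illustrative assumptions of ours, not numbers from Google’s paper.

```python
# Rough comparison of an orbital solar panel vs. the same panel on the
# ground. All three inputs are illustrative assumptions, not Google's data.
GROUND_CAPACITY_FACTOR = 0.15    # assumed: night, clouds, and sun angle cut
                                 # a typical ground panel to ~15% of peak
ORBIT_SUNLIGHT_FRACTION = 0.99   # assumed: a dawn-dusk sun-synchronous
                                 # orbit stays lit nearly all the time
ATMOSPHERE_GAIN = 1.3            # assumed: no atmospheric absorption in space

ratio = (ORBIT_SUNLIGHT_FRACTION * ATMOSPHERE_GAIN) / GROUND_CAPACITY_FACTOR
print(f"Orbital panel yield is roughly {ratio:.1f}x the same panel on Earth")
```

With these inputs the ratio comes out near 8.6, in the same ballpark as the “up to eight times” figure Google cites.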
Another problem is physics: received power falls off with the square of the distance between transmitter and receiver. Google points out that the satellites will need to stay within about a kilometer of one another, a tighter formation than any currently operational constellation. Google has developed analytical models suggesting that satellites placed just a few hundred meters apart would require only “modest station-keeping maneuvers.”
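The inverse-square effect is easy to see with a toy link-budget calculation. The beam divergence and receiver aperture below are illustrative assumptions, not values from Google’s paper.

```python
import math

# Toy free-space optical link: a laser beam spreads as it travels, so the
# spot area grows with distance squared, and the fraction of power landing
# on a fixed-size receiver shrinks as 1/d^2. Values are illustrative.
DIVERGENCE_RAD = 1e-3   # assumed beam divergence: 1 milliradian
RX_APERTURE_M = 0.1     # assumed receiver aperture diameter: 10 cm

def received_fraction(distance_m: float) -> float:
    """Fraction of transmitted power captured by the receiver."""
    spot_diameter = DIVERGENCE_RAD * distance_m        # spot grows linearly
    spot_area = math.pi * (spot_diameter / 2) ** 2
    rx_area = math.pi * (RX_APERTURE_M / 2) ** 2
    return min(1.0, rx_area / spot_area)               # capped at 100%

for d in (200, 1_000, 10_000):                         # separations in meters
    print(f"{d:>6} m: {received_fraction(d):.4f} of transmitted power received")
```

Doubling the separation cuts the received power by a factor of four, which is why the satellites must fly in such a tight formation.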
But this is not the end of the challenges. Hardware destined for space also needs to be hardened enough to withstand extreme temperatures and radiation. Google proposes to reuse components originally developed for use on Earth.
The company has already conducted radiation testing on its TPUs (Trillium v6e), and the results were “promising.” Google exposed the Trillium TPU v6e chip to a 67 MeV proton beam and found it surprisingly resilient. “No significant failure was attributed to TID up to the maximum dose tested of 15 krad (Si),” the paper noted, referring to total ionizing dose.
Building a space dream
Project Suncatcher is still an early-stage research project, but Google plans to test the concept in space soon: the company hopes to launch a pair of prototype satellites equipped with TPUs by early 2027. While launch costs for these first AI-carrying orbiters are expected to be very high, Google is looking to the mid-2030s, when launch prices are projected to fall to around $200 per kilogram or less.
At that point, space-based data centers could become roughly as economical as their terrestrial counterparts.
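A rough way to see why the economics could work at that price is to compare the amortized launch cost per kilowatt of orbital capacity against what a kilowatt costs to power on the ground for a year. Every input below is an illustrative assumption of ours, not a figure from Google.

```python
# Back-of-envelope cost parity check. Every input is an assumption.
LAUNCH_COST_PER_KG = 200       # $/kg, the mid-2030s target cited above
SAT_MASS_PER_KW = 10           # assumed kg of satellite per kW delivered
LIFETIME_YEARS = 8             # assumed satellite service life

launch_per_kw_year = LAUNCH_COST_PER_KG * SAT_MASS_PER_KW / LIFETIME_YEARS
print(f"Amortized launch cost: ~${launch_per_kw_year:.0f} per kW-year")

GROUND_RATE_PER_KWH = 0.08     # assumed industrial electricity rate, $/kWh
ground_per_kw_year = GROUND_RATE_PER_KWH * 24 * 365
print(f"Ground electricity:    ~${ground_per_kw_year:.0f} per kW-year")
```

With these assumptions the launch bill (about $250 per kilowatt-year) lands below the roughly $700 a terrestrial data center pays just for electricity, which is the intuition behind the parity claim.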

It may sound like something out of a science fiction movie, but big tech companies are racing to build data centers in space. Sam Altman, CEO of OpenAI, has acknowledged that the massive expansion of AI data centers on Earth cannot go on forever. “I think a lot of the world gets covered by data centers over time,” he told podcast host Theo Von.
While it would be nearly impossible to build a million-square-foot data center in space, Altman has proposed creating a Dyson sphere of data centers around the sun, a massive hypothetical structure built around a star to capture most of its energy.
Building an entire data center in space, even one the size of a single building, would require far more resources than building on Earth. Even so, several startups, including Starcloud, Axiom, and Lonestar Data Systems, have raised millions in recent months on the promise of developing them.
Earlier this year, Florida-based Lonestar Data Systems claimed to have successfully tested a small data center, the size of a hardback book, sent to the moon aboard Intuitive Machines’ Athena lunar lander, launched on a SpaceX rocket. Lonestar says placing data centers on the moon will give customers secure, reliable data processing while tapping the moon’s abundant solar energy for power.
High demand
One reason technology companies want to build data centers in space is the growing demand for artificial intelligence computing, which they believe cannot be met by existing infrastructure. The United States alone has about 5,400 data centers, and the number is steadily rising. More countries are now building AI-focused data centers, which form the backbone of global AI infrastructure.
AI tools like ChatGPT and Gemini consume large amounts of power even for brief interactions, and they typically run in data centers owned by AI companies. A bigger question is where these data centers are located, since their massive energy consumption translates into a large carbon footprint.
Training AI models also uses millions of liters of fresh water to keep computer systems cool while they churn through massive amounts of training data. Meanwhile, demand for data centers has soared, with annual growth of 19 to 22 percent expected through 2030, according to global management consulting firm McKinsey.
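Compounding makes those percentages larger than they may sound. A quick calculation (our arithmetic, with 2025 as an assumed starting year, not McKinsey’s model):

```python
# What 19-22% annual growth compounds to by 2030, starting from 2025.
YEARS = 2030 - 2025
for rate in (0.19, 0.22):
    factor = (1 + rate) ** YEARS
    print(f"{rate:.0%}/yr over {YEARS} years -> {factor:.1f}x today's demand")
```

That works out to roughly 2.4 to 2.7 times today’s demand within five years.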
Placing data centers in space could solve many of these problems, providing round-the-clock solar power and avoiding local air, water, and noise pollution. While a data center itself can be built quickly, finding a suitable site on the ground often takes a long time: approvals from local governments are slow, and residents frequently object to having one nearby. Space, by contrast, has almost no such regulation yet, which is another reason technology companies are looking upward.
But some challenges remain. Cooling the hardware will be a particular problem: conventional cooling relies on convection, which depends on gravity, so in orbit waste heat must ultimately be radiated away (see the sketch below). In addition, space weather can damage electronics and internal components, and the ever-growing amount of space debris poses a risk to physical hardware. Even so, technology companies appear committed to investing billions of dollars in putting AI data centers in space.
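For a sense of scale on the cooling problem, here is a rough radiator-sizing sketch using the Stefan-Boltzmann law. The heat load, radiator temperature, and emissivity are illustrative assumptions of ours, not figures from any company’s design.

```python
# In vacuum there is no air or water to carry heat away; waste heat must
# be radiated, following P = emissivity * sigma * area * T^4.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9         # assumed radiator surface emissivity
RADIATOR_TEMP_K = 300    # assumed radiator temperature (~27 °C)
HEAT_LOAD_W = 100_000    # assumed 100 kW of waste heat from the TPUs

flux = EMISSIVITY * SIGMA * RADIATOR_TEMP_K ** 4   # W radiated per m^2
print(f"Radiator area needed: ~{HEAT_LOAD_W / flux:.0f} m^2")
```

Dumping 100 kilowatts at these settings takes roughly 240 square meters of radiator, a panel about half the size of a basketball court, which is why thermal design looms so large for orbital computing.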