Off the coast of Shanghai, a bright yellow capsule is being readied for an experiment that could reshape how we think about server farms. A Chinese maritime tech firm plans to sink a pod of servers into the sea in mid-October 2025, promising dramatic cuts in the energy needed for cooling — but raising engineering and ecological questions along the way.
Why put servers under the sea?
Traditional data centers depend on power-hungry air conditioning or water-intensive evaporative cooling. Underwater facilities exploit a simple advantage: the ocean is cold. Highlander, the maritime equipment company behind the Shanghai project, says ocean currents can keep submerged servers cool without the energy-hungry chillers used on land. "Underwater operations have inherent advantages," says Yang Ye, Highlander’s vice president.
The company claims the setup can cut cooling energy by roughly 90 percent. The capsule will serve clients including China Telecom and a state-owned AI computing firm, and is part of a wider government push to reduce the carbon footprint of data infrastructure.
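To put that 90 percent figure in context, here is a rough back-of-envelope sketch in Python. The 1 MW IT load, the overhead split and the baseline power usage effectiveness (PUE) of about 1.4 are illustrative assumptions, not numbers from Highlander; the point is simply that a large cut in cooling energy translates into a smaller, though still meaningful, cut in total facility energy, because the servers themselves keep drawing the same power.

```python
# Back-of-envelope sketch with assumed numbers (not Highlander's data):
# a land-based facility with PUE ~1.4 spends roughly 0.4 W of overhead per
# 1 W of IT load, most of it on cooling. Cutting cooling energy by 90%
# shrinks that overhead but leaves the IT load itself unchanged.

it_load_mw = 1.0          # assumed IT load in megawatts
cooling_overhead = 0.35   # assumed cooling share of the PUE overhead
other_overhead = 0.05     # power distribution, lighting, etc. (assumed)

land_total = it_load_mw * (1 + cooling_overhead + other_overhead)
subsea_total = it_load_mw * (1 + 0.1 * cooling_overhead + other_overhead)

print(f"Land-based total:  {land_total:.2f} MW  (PUE {land_total / it_load_mw:.2f})")
print(f"Subsea total:      {subsea_total:.2f} MW  (PUE {subsea_total / it_load_mw:.2f})")
print(f"Facility-level energy saving: {100 * (1 - subsea_total / land_total):.0f}%")
```

Under these assumptions, a 90 percent cooling saving works out to a facility-level saving of a bit over 20 percent, which is still substantial at data-center scale.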
Engineering the deep: design and power
Building a data center that lives beneath the waves is a very different engineering challenge from erecting one onshore. Highlander assembled the pod in separate modules on a wharf near Shanghai before sealing it into a steel capsule. To fight saltwater corrosion, the shell is coated with a protective mix that includes glass flakes. An elevator links the submerged pod to a section that stays above the waterline, so maintenance teams can access internal systems.
On the energy side, Highlander says the installation will draw almost all of its electricity from nearby offshore wind farms. The company estimates more than 95 percent of the power will come from renewables, and earlier state-backed projects — including a 2022 test off Hainan province — received government subsidies to push development forward.

From lab trial to commercial questions
Microsoft tested a similar idea off Scotland in 2018, deploying a submerged data pod to study survivability and cooling. The pod was recovered in 2020, and Microsoft described the trial as a technical success but did not move to a commercial rollout. Experts say this shows the concept works at small scale; scaling to megawatt-class operations brings new challenges.
Benefits, risks and the debate over impact
Proponents point to several clear upsides:
- Significant reduction in cooling energy and associated carbon emissions, if the company’s 90% figure holds at scale.
- Proximity to coastal clients and submarine cables, potentially shortening network distances.
- Integration with offshore renewables for low-carbon power.
But critics and independent researchers urge caution. Laying high-capacity fiber and power links between the seabed and the shore is complex and costly. A team at the University of Florida and researchers in Japan have found that sound travels readily through water, meaning acoustic attacks could pose a new security threat to submerged servers. And then there’s the little-discussed problem of heat.
Marine ecologist Andrew Want of the University of Hull warns that even moderate increases in local water temperature could change species behavior, attracting some organisms while driving others away. Highlander points to an independent assessment of its 2020 Zhuhai trial that found surrounding waters remained within acceptable temperature limits. Still, Shaolei Ren of the University of California, Riverside, says thermal effects scale with capacity: "For megawatt-scale data centers underwater, the thermal pollution problem needs to be studied more carefully."
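To see why scale matters, here is a rough heat-balance sketch. The 1 MW heat load and the flow rates are assumed values, not figures from the Zhuhai assessment; the temperature rise of the cooling water follows from the heat load, the water mass flow and seawater’s specific heat.

```python
# Rough heat-balance sketch with assumed numbers (not from any cited study):
# the outlet-water temperature rise for a given heat load P follows from
#   P = m_dot * c_p * dT   =>   dT = P / (m_dot * c_p)

P_watts = 1.0e6        # assumed heat rejected: a 1 MW IT load
c_p = 4000.0           # specific heat of seawater, roughly 4,000 J/(kg*K)
rho = 1025.0           # density of seawater, roughly 1,025 kg/m^3

for flow_m3_s in (0.1, 1.0, 10.0):     # assumed cooling-water flow rates
    m_dot = rho * flow_m3_s            # mass flow in kg/s
    dT = P_watts / (m_dot * c_p)       # local temperature rise in kelvin
    print(f"flow {flow_m3_s:5.1f} m^3/s -> discharge water about {dT:.2f} °C warmer")
```

With vigorous flow and good mixing the rise is a fraction of a degree, but the same megawatt dumped into still or poorly mixed water warms its surroundings far more, which is why researchers want site-specific studies before capacity scales up.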
Highlander acknowledges construction was tougher than expected. Engineer Zhou Jun admits the onshore assembly, sealing and long-term corrosion protection required more work than planners initially predicted. The project also leans on subsidies: Highlander received 40 million yuan (about $5.6 million) for its Hainan trial, illustrating how government support is helping move prototypes toward commercial offerings.
Where underwater data centers fit in the ecosystem
Experts suggest these facilities are more likely to complement rather than replace traditional data centers. They could serve niche roles — coastal AI clusters, latency-sensitive clients, or facilities tethered to offshore renewables — while mainland data centers continue to host large-scale workloads.
Imagine a future where some AI training runs offshore, cooled by ocean currents and fed by wind turbines. It’s an alluring vision, but realizing it safely and sustainably will demand more research, robust environmental monitoring and careful engineering. For now, projects like the Shanghai pod are proving technical feasibility while testing the economic and ecological trade-offs at stake.
Source: sciencealert