Microsoft Data Sub Yields Reliability Gains
Microsoft said its two-year trial run with an underwater datacenter is a concept that holds water.
The company launched Project Natick in 2015 with a proof-of-principle deployment in the Pacific Ocean. It deployed its latest prototype in 2018 to determine whether a submerged datacenter the size of a shipping container was reliable, sustainable and economically practical. The data submarine was lowered into 117 feet of water on the seafloor off Scotland’s Orkney Islands. The site was chosen because the islands’ power comes from sustainable energy sources.
“The overarching goal of Project Natick is to evaluate the feasibility of underwater datacenters,” said Spencer Fowers, the project’s principal researcher. The initial goal was determining whether servers could be installed in a container without leaking, then seeing how long they would function in a submerged datacenter.
The next phase was demonstrating a submerged datacenter as a “manufacturable, production-scale component,” Fowers added. The central component fits on either a trailer or a container ship, and can be scaled to a desired size.
Along with energy usage, the project also addresses some of the operating drawbacks of terrestrial datacenters, including humidity and day/night temperature swings that are hard on IT hardware and corrode components. Then there are land acquisition and siting requirements in congested urban areas.
Microsoft engineers pumped the oxygen out of the submerged datacenter, controlled humidity and monitored the servers remotely.
The team tracked the submerged datacenter’s performance against a land-based version with the same components. “We see one-eighth the failure rate in the ocean datacenter than we do on land,” Fowers said.
With the concept proven, Microsoft said the next steps include demonstrating that the submersibles can be retrieved and recycled.
Project Natick was launched by the public cloud vendor on the premise that cloud computing consumption will soar, requiring more and larger datacenters. That infrastructure will consume ever greater amounts of energy for cooling and powering servers filled with energy-hogging processors running demanding AI and analytics workloads.
Cooling, which accounts for a large share of terrestrial datacenters’ energy budget, is an obvious advantage of an underwater datacenter. Below about 100 meters, water temperature can hover just above freezing. Project engineers note that consistent subsurface cooling is a key advantage. The Project Natick design could also leverage heat-exchange plumbing used in submarines.
As long as the container remains watertight, remote operators can monitor systems without anyone bumping into them, another advantage.
Microsoft notes that nearly half the U.S. population lives along the coasts. If scaled, underwater datacenters submerged just offshore could potentially bring data closer to users than land-based regional datacenters dependent on high-speed interconnections that strain to deliver real-time data, transactions, video, online gaming and other services.
At first glance, Project Natick and other “data barge” schemes seem, well, all wet.
With U.S. coastlines lately bursting into flame, however, the blue-water concept may have a future. “We are populating the globe with edge devices, large and small,” said William Chappell, vice president of mission systems for Microsoft Azure. “To learn how to make datacenters reliable enough not to need human touch is a dream of ours.”
A video describing Microsoft’s Project Natick is here.
George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).