While Apple rolls out iOS 14 and launches new Apple Watch and iPad models, Sony polishes the PlayStation 5 for release, and NVIDIA struggles to keep up with the hype around its new video cards while pursuing its acquisition of ARM, Microsoft has quietly completed the Project Natick experiment.
This week, Microsoft concluded its two-year experiment with Project Natick, an underwater data center it submerged on the seabed off the Scottish coast. The goal was to study how the hardware would fare in an isolated environment over a long period, and to assess its maintainability and the cost of refining the system for future operation in similar conditions. The experiment was also meant to demonstrate the advantages of placing such infrastructure beneath the sea.
Project Natick consists of 864 servers (with a total capacity of 27.6 petabytes) cleverly packed into a huge capsule: a steel container designed like a submarine hull. The air inside was pumped out and replaced with nitrogen, the vessel was sealed, and only power and data cables were left running to the outside.
Microsoft engineers argued that such a data center has far fewer drawbacks than a conventional one built on land:
- Free underwater cooling: radiators sit in dedicated housings outside the sealed compartment, with seawater flowing through them continuously. The heat released by so many racks of (by human standards) colossal computing power has no measurable effect on the water temperature at the seabed, so this "heat sink" stays cool at all times.
- It is simpler, cheaper, and safer for the environment than building a chiller plant that relies on refrigerant chemicals hazardous to the environment.
- Unlike on the surface, there are no sharp temperature swings at sea. Day or night, the temperature at the seabed barely changes, since sunlight does not reach that depth. Seasonal variation is gradual and independent of surface weather and short-term temperature fluctuations.
- The inside of the container is empty, cold, and dry, with none of the humidity that harms electronics.
- There is no risk of incidents caused by human error.
During the experiment, Microsoft powered the data center from onshore solar panels and wind turbines. In time, the company plans to deploy data centers on the seabed at full scale and connect them to new infrastructure built at the deployment sites, supplying them with so-called "green energy" from wind and waves, both of which the sea offers in abundance.
For all its advantages, the system has one drawback: it cannot be partially repaired in place. The data center is designed to operate underwater for five years, after which it is shut down, lifted out of the water, and transported to a "workshop" for full maintenance. After passing all checks, it may be put back into service.
Microsoft also identified a clear pattern in the statistics: the component failure rate was 87.5% lower, one-eighth that of a comparable land-based data center.
The company is quite satisfied with the results, but these are only preliminary findings; the study and analysis of the collected data continue. Naturally, a full-scale rollout of Microsoft's underwater data centers is not yet on the table.