Exoscale tests Diggers' new cooling system for data centers

Austrian cooling specialist Diggers combines chip and cabinet cooling in closed systems for energy-efficient AI computing.


(Image: Diggers)


The Austrian cooling specialist Diggers wants to dispense entirely with external air cooling, recover up to 98% of the servers' waste heat, and at the same time halve the cooling effort. To achieve this, it combines chip and cabinet cooling in a closed system.
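A rough calculation illustrates what those figures mean in practice. The following minimal Python sketch uses an assumed 100-kilowatt server load and an assumed 30-kilowatt conventional cooling overhead; both numbers are illustrative examples, not figures from Diggers or Exoscale:

    # Back-of-the-envelope illustration of the quoted figures.
    # The 100 kW IT load and 30 kW conventional cooling overhead are
    # assumed example values, not data from Diggers or Exoscale.
    it_load_kw = 100.0

    # Up to 98% of the waste heat is said to be recoverable at a
    # usable temperature level.
    usable_heat_kw = 0.98 * it_load_kw
    print(f"usable waste heat: {usable_heat_kw:.0f} kW")

    # Cooling effort is said to drop by 50% versus a classic setup.
    conventional_cooling_kw = 30.0
    diggers_cooling_kw = 0.5 * conventional_cooling_kw
    print(f"cooling power: {diggers_cooling_kw:.0f} kW instead of "
          f"{conventional_cooling_kw:.0f} kW")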

The systems presented so far, which are half the size of a server rack, can accommodate four housing-less servers, each with two CPUs and three GPU modules. Two of these cabinets, or boxes, can be stacked on top of each other. Each box forms a self-contained system and has its own water circuit, which is connected to the secondary circuit. This allows the protection class to be increased to IP67. Diggers states a usable output temperature of 50 °C; tests with higher temperatures have already been successful.

Each box forms a self-contained system. It holds four housing-less servers with three GPU modules each and has its own water circuit connected to the secondary circuit.

(Image: Diggers)

The chip cooling removes heat from the largest heat sources via DLC (direct liquid cooling) heat sinks. The waste heat from the other server components, which DLC could cool only with disproportionate effort or not at all, is transferred to the water circuit by eight heat exchangers. These air-to-water heat exchangers form an air-permeable wall behind the four servers and are equipped with fans that move the air in the cabinet from front to rear, over the server components and through the heat exchangers.

Eight heat exchangers equipped with fans draw the residual heat out of the cabinet.

(Image: A1 digital)

Within the data center itself, the heat then travels only through the secondary water circuit; raised floors and classic data center air conditioning become unnecessary. Like other systems with closed cabinet cooling, Diggers' cabinets are therefore also suitable for other environments, and the IP67 version even for harsh and outdoor settings.

A redundant, monitored vacuum pump keeps the pressure in the water pipes low, so that in the event of a leak air is drawn into the pipes instead of water escaping, and the pump extracts the water. If its running time, typically two times ten minutes per day, suddenly increases, this also triggers an alarm.
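In software terms, that leak alarm boils down to a simple runtime threshold. The following Python sketch illustrates the rule: the 20-minute daily baseline comes from the article, while the doubling threshold and all names are hypothetical assumptions, not Diggers' actual monitoring logic:

    # Minimal sketch of the pump-runtime alarm described above.
    # The 20 min/day baseline is from the article (2 x 10 minutes);
    # the doubling threshold and all names are assumptions.
    BASELINE_MIN_PER_DAY = 20.0
    ALARM_FACTOR = 2.0  # assumed: alarm once daily runtime doubles

    def pump_runtime_alarm(minutes_today: float) -> bool:
        """Return True if today's pump runtime suggests a leak."""
        return minutes_today > BASELINE_MIN_PER_DAY * ALARM_FACTOR

    for runtime in (18.0, 55.0):
        status = "ALARM" if pump_runtime_alarm(runtime) else "ok"
        print(f"pump ran {runtime:.0f} min today: {status}")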

Diggers has come up with another special feature for GPU cooling. The three GPU modules are mounted around 10 cm above the server board and equipped with their own large heat sinks. These are not made of copper, however, but of aluminum. Among other things, aluminum is corrosion-resistant and can be recycled at a far lower energy cost: around 5% of that of primary aluminum. For now, the GPUs' aluminum cooling elements are connected to the water circuit via flexible hoses; in the future, they are to be permanently integrated into the circuit.

The server modules are enclosed in a frame instead of a housing. On the left you can see the mainboard with CPUs and RAM, on the right the three GPU modules with their aluminum heat sinks, and at the rear of the frame, on the far right, part of the water pipe with its connector.

By dispensing with housings for servers and power supply units, Diggers follows the principles of the Open Compute Project (OCP), which also works toward the economical use of resources and avoids nesting housings inside housings. The idea of using a few large-diameter fans has already proven energy-saving in OCP hardware. Diggers is also working on a model with rack-mounted power supply units that feed the servers with 12 volts via power rails, likewise an OCP concept.

Diggers aims to be flexible when it comes to cabinet formats. In addition to the half-height cabinets in its own proprietary format, which are in operation at Exoscale both as demonstration systems and as evaluation models, the manufacturer currently also offers 19-inch models. Further formats are to be developed on request.


Exoscale, a hoster and cloud provider and a subsidiary of Austria's A1 Digital, is currently evaluating the system in its data center in Vienna-Floridsdorf. An entire row of racks with these systems is running there in a room without a raised floor or data center air conditioning, connected to the outside world only via network, power, and water lines. Exoscale confirms the savings figures quoted by Diggers.

Meanwhile, further feedback came from the GenLearning Center at EPFL (École Polytechnique Fédérale de Lausanne), which used the systems at Exoscale for two days for an AI competition: “Everything ran extremely smoothly, and even when everyone was accessing the servers at the same time, there were no dropouts. It's really the first time I've experienced a workshop with models that run so smoothly at the end. The stability and performance of Exoscale's GPUs allowed over 50 participants to seamlessly deploy and experiment with state-of-the-art LLMs without any bottlenecks,” said Andrei Kucharavy, assistant professor and co-head of the GenLearning Center.

(sun)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.