
March 01, 2011

SuperMUC is the next hot-water-cooled IBM supercomputer

At this year's CeBIT, IBM is presenting its first so-called hot-water-cooled systems, offering a sneak preview of future innovations: supercomputers the size of sugar cubes.

The next hot-water-cooled IBM system is already on the drawing board, this time in Germany. It will be significantly larger than Aquasar and is expected to go into operation at the Leibniz Supercomputing Centre (LRZ) in Munich, Germany, by 2012. Called SuperMUC, this new computer will be part of the Partnership for Advanced Computing in Europe (PRACE) HPC infrastructure and made available to scientists and research institutes throughout Europe. The system has a peak performance of 3 petaflop/s (3 × 10^15 arithmetic operations per second) and is based on an IBM System x iDataPlex®, which contains more than 14,000 next-generation Intel Xeon processors. SuperMUC will be more powerful than 110,000 PCs, enabling LRZ scientists to verify theories, develop experiments and predict results to an unprecedented extent—all this while requiring far less energy.
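A quick back-of-envelope check of the figures above. The inputs come straight from the article; the derived per-processor and per-PC numbers are rough illustrative estimates, not official specifications:

```python
# Back-of-envelope check of the SuperMUC figures quoted above.
# Inputs are from the article; derived per-unit numbers are rough estimates.

PEAK_FLOPS = 3e15          # 3 petaflop/s peak performance
NUM_PROCESSORS = 14_000    # "more than 14,000" Intel Xeon processors
PC_EQUIVALENT = 110_000    # "more powerful than 110,000 PCs"

flops_per_processor = PEAK_FLOPS / NUM_PROCESSORS   # ~214 GFLOP/s per processor
flops_per_pc = PEAK_FLOPS / PC_EQUIVALENT           # implies ~27 GFLOP/s per PC

print(f"Per processor: {flops_per_processor / 1e9:.0f} GFLOP/s")
print(f"Implied per-PC baseline: {flops_per_pc / 1e9:.0f} GFLOP/s")
```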



Future innovation: 3D integration

Looking further ahead, so-called 3D chips promise even higher performance with lower energy consumption. Paving the way for exascale computers, IBM scientists are pursuing extensive research on 3D integration. 3D chip architectures, in which processors are stacked on top of one another, not only reduce the chip's footprint but also shorten the communication distances between the stacked dies and increase on-chip data-transmission bandwidth many times over.

One of the main limitations in developing 3D chip layouts currently lies in the performance of conventional coolers. More complex designs with extremely thin, stacked processors can reach power densities of up to 5 kW/cm³ (kilowatts per cubic centimeter)—a power density that exceeds that of any current heat engine, such as an internal combustion engine, by a factor of ten.

At IBM Research – Zurich, novel concepts to scale cooling technologies for 3D chip stacks are being explored. In test systems, water is piped directly between the individual chip layers through microscopic channels only about 50 microns wide. Such designs allow 3D stacks of heating elements to be cooled very efficiently at the heat fluxes released by today's processors.
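The cooling capacity of such water channels can be sanity-checked with the basic heat-balance relation q = ṁ · c_p · ΔT. The heat flux, layer area, and allowed temperature rise below are illustrative assumptions, not figures from the article:

```python
# Sketch: water mass flow needed to carry away a given heat load,
# via q = m_dot * c_p * delta_T. Input values are assumed for illustration.

C_P_WATER = 4186.0   # specific heat of water, J/(kg*K)

heat_flux = 100.0    # assumed heat load per layer, W/cm^2
chip_area = 4.0      # assumed layer area, cm^2
delta_t = 20.0       # assumed allowed coolant temperature rise, K

heat_load = heat_flux * chip_area           # total heat to remove: 400 W
m_dot = heat_load / (C_P_WATER * delta_t)   # required mass flow, kg/s

print(f"Heat load: {heat_load:.0f} W")
print(f"Required water flow: {m_dot * 1000:.1f} g/s")
```

Even a few hundred watts per layer calls for only a few grams of water per second, which hints at why pushing coolant through 50-micron channels is considered feasible at all.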

Before the first fully functional prototypes can be realized, which is expected to happen within the next seven to ten years, researchers must still overcome several technical hurdles. Their aim is to develop a system with an optimized flow of water through the thin layers that at the same time reliably isolates the electronics from the water. A special difficulty is posed by the thousands of electronic connections that run vertically through the chip stack. In fact, the density of components in such a system would be comparable to that of the human brain, which is intersected by millions of nerve fibers for signal processing and tens of thousands of blood capillaries for nutrient and heat transfer—without these interfering with each other.

The three-dimensional integration of computer chips is one of the most promising approaches to boost performance tremendously while reducing energy consumption considerably. Supercomputers as small as sugar cubes could thus become reality.

CMOSAIC 3D Chip Cooling

A team of IBM researchers collaborating with two Swiss university partners aims to extend Moore's Law another 15 years by using 3-D stack architectures with liquid cooling microchannels.

IBM, École Polytechnique Fédérale de Lausanne (EPFL) and the Swiss Federal Institute of Technology Zurich (ETH) have launched a joint four-year project called CMOSAIC, which will investigate how the latest chip cooling techniques can support 3D chip architectures. The project will examine a 3D multi-core stack architecture with an interconnect density ranging from 100 to 10,000 connections per square millimetre.
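To get a feel for that interconnect-density range, the sketch below translates it into total vertical connections per stack layer. The 100–10,000 per mm² range is from the article; the 1 cm² die area is an assumption for illustration:

```python
# Sketch: total through-stack connections implied by the CMOSAIC density range.
# Density range is from the article; the die area is assumed.

DENSITY_LOW = 100       # connections per mm^2 (low end)
DENSITY_HIGH = 10_000   # connections per mm^2 (high end)
die_area_mm2 = 100      # assumed 1 cm^2 die

low = DENSITY_LOW * die_area_mm2      # 10,000 connections per layer
high = DENSITY_HIGH * die_area_mm2    # 1,000,000 connections per layer

print(f"Connections per layer: {low:,} to {high:,}")
```

At the high end, a single layer of a modest-sized die would carry on the order of a million vertical connections, which is what makes routing coolant around them so delicate.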

