It may take a leap to believe that computers can function in fluid, but at the Texas Advanced Computing Center in Austin that’s exactly what’s being done.
Picture a 42U server rack (each U is 1.75 inches) tilted on its side in a fluid-filled tank — something akin to a large freezer — and you have a general idea of the scene.
The Texas computing centre has been testing servers fully immersed in a tank of mineral oil, a non-conducting fluid. The technology was developed by a neighboring Austin firm, Green Revolution Cooling.
Mark Tlapak, co-founder of the two-year-old, Austin-based Green Revolution, describes the fluid as baby oil without the fragrance: non-toxic, safe and something that works well with electronics.
In this pilot, fans have been removed from the servers and the disk drives have been encapsulated to keep fluid from getting inside them. But otherwise these are standard industry systems, Dell machines in this case, that have been submerged in fluid.
“We get better operating efficiency — it greatly reduces the amount of power,” said Dan Stanzione, deputy director of the supercomputing centre.
The mineral oil is held at 105 degrees Fahrenheit, which keeps the processors and disk drives running at about 115 degrees, within their normal operating temperature range.
Air cooling takes far more energy. Getting the air temperature down to 70 degrees Fahrenheit requires chilling the cooling fluid to about 45 degrees, and the chips still run at about 115 degrees. “Air is not a very effective mechanism for conducting heat away,” said Stanzione.
The Texas computing centre, which is based at the University of Texas, is now considering immersing a couple of dense racks that are running in production. It is assessing the long-term effects of mineral oil cooling on component reliability and disk drive performance.
The mineral oil may also be acting as a shock absorber for the disk drives, Stanzione said; most disk drives are believed to suffer some performance penalty from vibration, which forces them to re-read or rewrite data.
Even so, mineral oil is a stop-gap measure against the rising operating cost of running a data centre. “Something is going to have to give in the silicon space — we can’t keep increasing power forever,” said Stanzione.
The Texas computing centre has a three-year-old supercomputer, Ranger, which has 15,744 quad-core processors and uses about 2.5 megawatts of power. The computer’s power bill, not including cooling, is more than $1 million a year. Cooling adds about 30% to that cost.
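As a rough sanity check on that figure, here is a back-of-the-envelope sketch; the electricity rate is an illustrative assumption, not a number from the article:

```python
# Back-of-the-envelope check of Ranger's annual power bill.
# The electricity rate is an assumed figure for illustration only.
power_mw = 2.5                    # Ranger's draw, per the article
hours_per_year = 24 * 365         # 8,760 hours
rate_per_kwh = 0.05               # assumed $/kWh; actual utility rates vary

energy_kwh = power_mw * 1000 * hours_per_year   # about 21.9 million kWh
annual_bill = energy_kwh * rate_per_kwh         # about $1.1 million

print(f"{energy_kwh / 1e6:.1f} million kWh -> ${annual_bill / 1e6:.2f}M per year")
```

At around five cents per kilowatt-hour, 2.5 megawatts running around the clock does indeed come to just over $1 million a year, consistent with the centre’s figure.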
The power usage effectiveness (PUE) for Ranger is about 1.3, meaning that for every $1 it spends on running the computer, it spends about 30 cents on cooling. PUE is a metric based on total facility power — including the cooling system, UPS and lighting — divided by the power used by the IT equipment, including servers, networking and storage hardware.
A PUE of 1.0 means that no power is used for cooling and power distribution, and Stanzione says he can get close to that mark with the mineral oil. In the prototype system they are currently cooling an entire 42U rack with about 250 watts of power, he said.
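A minimal sketch of that arithmetic: the 2.5-megawatt, 30% and 250-watt figures come from the article, while the 10-kilowatt rack load in the immersion case is a hypothetical assumption for illustration.

```python
# PUE = total facility power / IT equipment power.

# Air-cooled case: cooling adds about 30% on top of the IT load (article figure).
it_power_kw = 2500.0                          # Ranger's IT load, ~2.5 MW
pue_air = (it_power_kw * 1.30) / it_power_kw
print(f"Air-cooled PUE ~ {pue_air:.2f}")      # ~ 1.30

# Oil-immersion case: ~250 W cools an entire 42U rack (article figure).
# The rack's IT load below is an assumption, not from the article.
rack_it_kw = 10.0                             # assumed IT load of one dense rack
pue_oil = (rack_it_kw + 0.25) / rack_it_kw
print(f"Oil-cooled PUE ~ {pue_oil:.3f}")      # ~ 1.025, close to 1.0
```

Under that assumption, the cooling overhead drops from roughly 30% to about 2.5%, which is why a PUE approaching 1.0 is plausible.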
Green Revolution isn’t disclosing prices for its systems; the cost depends on whether a system is being installed in a new data centre or retrofitted into an existing one, said Tlapak. A retrofit has the longer payback period, within three years, he said.
Ranger cost $30 million and is expected to be in operation for about four years. The computing centre may spend a similar amount on its next system, but the performance gains will mean increased power usage, and Stanzione estimates that the next system will cost $3 million to $4 million in power.
“That’s not a curve we can stay on forever,” Stanzione said.
He believes that to get off that power curve, something will have to change: the fundamental technology with which transistors are built, the architectures around which computers are designed, or the way software is written.