Some innovative university researchers are focusing on cutting the cost of cooling the hot racks of servers in data centers. Last month, the Georgia Institute of Technology set out to create one of the world's most efficient data centers on the school's campus, where researchers can test new cooling designs and measure the impact those designs have on power efficiency.
The Georgia Tech researchers aim to analyze power consumption "all the way from the chip to the data center facility," says Yogendra Joshi, a professor of mechanical engineering at the university.
"We are addressing the inefficiencies at all scales," Joshi says. "Some researchers are looking at cooling at the chip level, some are looking at the cabinet level, and some are looking at the facilities level."
Two major trends in the data center sector are driving the interest in cooling. First, demand for data centers continues to rise despite the down economy, and Moore's Law (the prediction that processors will become twice as powerful every 18 months to two years) means that data centers will produce ever more heat. Second, companies looking to build new data centers are finding resources increasingly scarce: power is more expensive, and water for cooling is harder to come by.
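To see why that doubling cadence worries data center operators, a rough back-of-the-envelope calculation helps. The sketch below is not from the article; it simply compounds the 18-month-to-2-year doubling rate, under the simplifying (hypothetical) assumption that heat output tracks processor performance:

```python
# Rough illustration of the Moore's Law doubling cadence described above.
# Assumes, hypothetically, that heat output scales with processor performance;
# 18- and 24-month doubling periods bracket the article's stated range.

def performance_multiplier(years: float, doubling_months: float) -> float:
    """Performance (and, under our assumption, heat) multiplier after
    `years`, given one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

for months in (18, 24):
    print(f"Doubling every {months} months -> "
          f"{performance_multiplier(6, months):.0f}x after 6 years")
```

Over a typical six-year facility planning horizon, that works out to an 8x to 16x increase, which is why cooling efficiency research spans the chip, cabinet, and facility levels rather than any single one.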