"People are looking at data center efficiency, whereas five years ago it wasn't an issue," says Adam Fairbanks, Bluestone Energy, a company that retrofits old data centers to make them more energy efficient and to qualify for utility rebates (many utilities are required to help pay for data center projects that will reduce energy use; if a project can be proven to cut energy draw by 20%, the utility might pay for as much as half the cost of the project). "Today any new data center build gets scrutinized by the CFO as well as facilities and IT."
Where lowering a company's carbon footprint was a big driver for such projects a few years ago, environmental concerns have been pushed aside by the economy; today these projects are mostly a matter of reducing operating expense, Fairbanks says. "Money drives the majority of the projects we work on," he says.
Fairbanks shares some of the most popular methods his clients have been using to cut energy costs in a power-guzzling data center:
- Turn the thermostat up. The common wisdom about how cold a data center needs to be has changed, and an ASHRAE committee has revised its recommended data center temperature range upward, to 70-77 degrees. "People have said that's conservative, and many equipment manufacturers have said that up to 90 degrees is OK for their products," Fairbanks says. However, you have to be able to manage the movement of air before you can raise temperatures, he warns. If the air is not coming up through the floor properly (because cabling or other obstructions are in the way, for example) or is swirling around, you won't see efficiencies.
And you still have to cool computing equipment, even with a set point of 90 degrees. A server left running by itself uncooled would probably fry itself, Fairbanks says. "At one data center I was at recently, we did a thermal scan, where we measure and map temperatures all over the facility. One rack was at 110 degrees, which is a danger level," he says. With the proliferation of blade racks, such high cabinet temperatures are becoming more common, and there's a tendency to put all the racks in one corner of the data center, which creates one huge hot spot.
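At its core, a thermal scan like the one Fairbanks describes reduces to flagging readings above a danger threshold. The sketch below is hypothetical; the rack names, readings, and 100-degree alert threshold are illustrative assumptions (Fairbanks cites 110 degrees as a danger level).

```python
# Hypothetical hot-spot check over per-rack temperature readings (deg F).
# Threshold and sample data are illustrative, not from the article.
DANGER_F = 100

def hot_spots(readings, threshold=DANGER_F):
    """Return the racks whose temperature meets or exceeds the threshold."""
    return {rack: temp for rack, temp in readings.items() if temp >= threshold}

scan = {"rack-01": 78, "rack-02": 110, "rack-03": 92}
dangerous = hot_spots(scan)  # flags rack-02
```

A real thermal-mapping tool would also track where the racks sit, since a cluster of flagged racks in one corner is the "huge hot spot" pattern described above.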
- Upgrade the HVAC. "About 30% of the power used by a data center is consumed by cooling," Fairbanks says, and the average data center is over-cooled by three or four times. A new cooling system also causes less stress on day-to-day operations than bringing in other types of new equipment. "If you put in new servers and power units, you have to rewire half the data center and move things around and it's higher risk than changing the HVAC," he says. "If you have a backup HVAC system for redundancy, you can flip over to the backup while you install the new system and achieve payback quickly."
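Fairbanks' two figures, cooling at roughly 30% of total draw and over-cooling by three to four times, imply a rough savings bound. The back-of-envelope sketch below assumes, purely for illustration, that cooling power scales linearly with cooling output, which real chillers only approximate.

```python
# Back-of-envelope: if cooling is ~30% of total facility power and the room
# is over-cooled by 3x, right-sizing cooling to one third of its current
# output would cut total facility power by about 30% * (1 - 1/3) = 20%.
# Assumes cooling power scales linearly with output (an idealization).
cooling_share = 0.30       # fraction of facility power spent on cooling
overcool_factor = 3.0      # how many times more cooling than needed
savings_fraction = cooling_share * (1 - 1 / overcool_factor)
```

Under these assumptions the result is about a 20% cut in total draw, which happens to match the threshold mentioned earlier for qualifying for a utility rebate.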
- Use cold and hot aisles. In this layout, racks are arranged in alternating rows so that server intakes face a shared cold aisle and exhausts face a shared hot aisle, keeping the cold air used to cool computers separate from the hot air they generate. The method has been around for years, but has become more widely adopted this year.
- Try blanking panels. Server racks often have holes in the back of the cabinet, especially racks that are not full of blades. The cold air that is pushed up through the floor into these racks can escape out of the holes and into the hot aisle, causing the air conditioning system to run less efficiently. A blanking panel closes over the holes so that the cold air is used exclusively to cool the servers in the racks.
- Virtualize. "There's often a conflict between the business units that own the racks and the IT staff that want to use virtualization," Fairbanks says. But here's an incentive: his company has qualified data centers for utility rebates through virtualization projects, since fewer servers draw less power.
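The power case for virtualization is straightforward consolidation arithmetic. The sketch below is hypothetical; the server counts, the 200-watt average draw per retired machine, and the function name are illustrative assumptions, not figures from the article.

```python
# Hypothetical consolidation sketch: virtualizing lightly loaded physical
# servers onto fewer hosts retires the power draw of the decommissioned
# machines. All figures below are illustrative assumptions.

def consolidation_savings_kwh(n_physical, n_hosts, avg_watts=200,
                              hours_per_year=8760):
    """Annual kWh saved by retiring (n_physical - n_hosts) servers."""
    retired = n_physical - n_hosts
    return retired * avg_watts * hours_per_year / 1000

# E.g., 40 lightly utilized servers consolidated onto 5 virtualization hosts:
saved = consolidation_savings_kwh(40, 5)
```

A savings figure like this, measured rather than assumed, is what lets a virtualization project qualify for the utility rebates Fairbanks mentions.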
- Get cooling and heating equipment to work together. Some inefficiencies are caused by computer room air conditioning (CRAC) units that operate independently and often fight each other, Fairbanks notes. Heating systems can conflict with air conditioning, and humidifiers sometimes defeat the purpose of dehumidifiers. Bluestone offers a system of sensors and controls that monitors temperature and humidity throughout a data center and aggregates readings from all the units at a central point, which then manages all the set points. The company also provides fan trays that pull air from the floor efficiently into racks where wires or other obstacles are impeding airflow.
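The core idea of such a system, many distributed sensors feeding one control point, can be sketched as a simple aggregation step. This is not Bluestone's software; the sensor names, reading format, and statistics below are illustrative assumptions.

```python
# Hypothetical sketch of aggregating distributed temperature/humidity
# readings at a central point, in the spirit of the system described.
# Sensor names and the (temp_f, rh_pct) tuple format are assumptions.
from statistics import mean

def aggregate(readings):
    """Reduce per-sensor (temp_f, rh_pct) readings to facility-wide stats."""
    temps = [t for t, _ in readings.values()]
    hums = [h for _, h in readings.values()]
    return {
        "avg_temp_f": mean(temps),
        "max_temp_f": max(temps),
        "avg_rh_pct": mean(hums),
    }

readings = {
    "crac-1": (72.0, 45.0),
    "crac-2": (75.0, 50.0),
    "hot-aisle-3": (95.0, 30.0),
}
stats = aggregate(readings)
```

With a facility-wide view like this, a controller can adjust set points coherently instead of letting individual CRAC units, humidifiers, and dehumidifiers fight each other.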
Adapted from http://www.wallstreetandtech.com