CRAC units force chilled air into a data center and around the equipment. In most cases, cooling such vast volumes of air is very inefficient. Although hot- and cold-aisle containment reduces the volume to be cooled, it still results in a lot of excess cooling and high costs for powering the CRACs.
So why not consider more selective data center cooling options that work alongside your CRAC units, or that chill specific critical loads efficiently and at lower energy cost?
Modern data centers adopt InRack and InRow coolers (IRCs), also known as close-coupled cooling systems, because they are tailor-made for high densities of hot-running IT equipment and tight energy budgets. These cooling strategies are inherently more efficient than standard CRAC systems because they couple the cooling directly to the IT equipment rather than sending cooled air into the room space. The units may be mounted among the IT racks/cabinets, overhead, or under the floor.
InRack Cooling
Dedicated racks, another low-effort retrofit, offer cooling isolation. The rack operates just like a standard data center rack, but it is sealed on all sides as a self-contained system. Cool air is forced up through the rack from the bottom, running over the equipment before exiting through the top to a hot plenum, where the heat is vented or recovered as necessary.
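To get a feel for how such a sealed rack is sized, the airflow it must move can be estimated from the sensible-heat equation Q = ρ · V̇ · cp · ΔT. The sketch below is illustrative only; the 10 kW heat load and 12 K supply-to-return temperature rise are assumed figures, not values from this article.

```python
# Illustrative sketch: airflow a sealed InRack cooler must move to carry
# away a rack's heat load, from the sensible-heat equation
# Q = rho * V * cp * dT. The 10 kW load and 12 K delta-T are assumptions.

RHO_AIR = 1.2    # air density, kg/m^3 (approx., near sea level and 20 C)
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove heat_load_w with a
    supply-to-return air temperature rise of delta_t_k."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

if __name__ == "__main__":
    flow = required_airflow_m3s(10_000, 12)  # 10 kW rack, 12 K rise
    print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")
```

Under these assumptions the cooler moves roughly 0.7 m³/s for a single rack, versus the far larger volumes a room-level CRAC must circulate to achieve the same effect.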
InRow Cooling
In-row cooling systems work within a row of standard server racks. The units are standard rack height, making them easy to match with the row and couple tightly to the IT equipment for efficient cooling. Systems from APC by Schneider Electric, Liebert by Emerson Network Power, Rittal and others are engineered to take up the smallest possible footprint while offering high-density cooling. Ducting and baffles ensure that the cooling air gets where it needs to go.
Compared with a room-oriented architecture, the airflow paths in close-coupled cooling systems are shorter and more clearly defined. Smaller fans can be used because lower volumes of chilled air are moved, minimizing energy costs; it is easier to target air at high-density hot spots for preferential cooling; and business continuity improves, since the failure of any single unit affects only that rack or cabinet rather than the whole data center or aisle containment. And because most of these systems are modular, it is easy and cost-effective to build in degrees of resilience, leading to higher availability across the whole data center.
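The "smaller fans, lower energy cost" point follows from the fan affinity laws, under which fan power scales roughly with the cube of airflow. The sketch below illustrates this; the 5 kW baseline fan power and the halved airflow are assumed numbers, not vendor measurements.

```python
# Illustrative sketch of the fan affinity laws: fan power scales roughly
# with the cube of airflow, so moving less air over short, close-coupled
# paths cuts fan energy sharply. Baseline figures are assumptions.

def fan_power_w(baseline_power_w: float,
                baseline_flow: float,
                new_flow: float) -> float:
    """Estimate fan power at new_flow from a known baseline operating
    point, assuming power ~ flow^3 (fan affinity law)."""
    return baseline_power_w * (new_flow / baseline_flow) ** 3

if __name__ == "__main__":
    # A room-level CRAC fan drawing 5 kW at full flow, versus
    # close-coupled units that together need only half the airflow:
    print(fan_power_w(5000, 1.0, 0.5))  # -> 625.0 W, one eighth
```

In other words, halving the volume of air that must be moved cuts the fan power to roughly an eighth, which is why close-coupled units with short airflow paths can run on much smaller fans.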
About us
Strategic Media Asia (SMA) is one of the approved CPD course providers of the Chartered Institution of Building Services Engineers (CIBSE) UK. The team exists to provide an interactive environment and opportunities for members of the ICT industry and facilities engineers to exchange professional views and experience.
SMA connects IT, Facilities and Design. For more data center design considerations, please visit:
(1) Site Selection,
(2) Space Planning,
(3) Cooling,
(4) Redundancy,
(5) Fire Suppression,
(6) Meet Me Rooms,
(7) UPS Selection,
(8) Raised Floor,
(9) Codes & Standards.
All topics focus on key components and provide technical advice and recommendations for designing a data center and critical facilities.