
Think Inside the Box & Implement Containment for Energy Savings


Airflow containment is the practice of segregating the aisles of a data center so the hot exhaust air from servers does not mix with incoming cold air, while also directing airflow into and out of the data center floor more efficiently. According to the Uptime Institute’s 2014 Data Center Industry Survey, only 30% of operators use some form of containment in at least three quarters of their data center, and fewer than half of all survey respondents had at least 50% of their data center heat contained.

That leaves a lot of white space without any form of containment, even though containment is one of the best ways to improve energy efficiency, translating into a more reliable environment as well as direct cost savings.

Things have improved over the past few years, to be sure. But airflow containment remains a significant upfront investment that data center operations teams might not consider, especially at smaller providers or in-house facilities. However, it can deliver a real return on investment.

Green House Data is currently evaluating containment in our Seattle, WA data center, which is inside the Westin Building Exchange. In all likelihood we’ll have it installed on at least one floor this year, and we expect to save enough energy to pay for the equipment and installation costs within a couple of years. That may seem like a while, but after that period, all the energy we save is essentially profit.
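The payback math behind that estimate is simple enough to sketch. The figures below are hypothetical placeholders, not our actual project numbers; plug in your own installation quote, projected energy savings, and utility rate.

```python
# Simple payback estimate for a containment project.
# All numbers are hypothetical -- substitute your own quotes and
# measured energy data before drawing any conclusions.

upfront_cost = 40_000.0        # equipment + installation, USD
annual_kwh_saved = 150_000.0   # projected cooling energy savings, kWh/yr
price_per_kwh = 0.10           # blended utility rate, USD/kWh

# Annual dollar savings, then years to recoup the upfront spend
annual_savings = annual_kwh_saved * price_per_kwh
payback_years = upfront_cost / annual_savings

print(f"Annual savings: ${annual_savings:,.0f}")
print(f"Simple payback: {payback_years:.1f} years")
```

After the payback period, the annual savings continue for the life of the equipment, which is what makes the upfront cost easier to justify.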


Why Contain?

ASHRAE has raised its recommended data center operating temperatures, but very high temperatures (over 100 degrees Fahrenheit) can lead to equipment failure. Overcooling, on the other hand, is very expensive when you’re pumping conditioned air through tens of thousands of square feet.

In addition to improving cooling efficiency by up to 40%, containment also eliminates, or at least minimizes, “hot spots” in the data center: areas where hot air pools, causing equipment problems. Containment also lets economizers or free cooling equipment run for more hours per year, while cooling systems can operate above the dew point temperature, reducing the use of humidifiers and dehumidifiers on the floor.


What Containment Options Exist?

Aisles can have partial or full containment, in either a cold aisle or hot aisle configuration. A partial containment option might be as simple as adding plastic flaps to the ends of your aisles. Even this small step can have an impact on efficiency. With a partial containment solution, airflow can still escape around the edges and tops of your aisles, but each aisle still keeps its hot exhaust air facing another hot aisle.

Full containment involves sealing off the entire aisle, with doors at the ends and barriers blocking the gap between the tops of the cabinets and the ceiling. This is a less flexible option, so you’ll need a carefully designed environment and plenty of advance planning, but it is far more efficient.

Cold aisle containment is less efficient than hot aisle containment. It seals off the cold aisle, where incoming conditioned air enters the servers; the rest of the data center floor, which is open, sits at the temperature of the hot exhaust air. Hot aisle containment is the opposite, with the exhaust heat trapped inside the contained area and ducted from there out of the data center floor.

When evaluating containment, calculate the ratio of cooling capacity from your cooling equipment to the estimated heat load from the full data center floor (be sure to project out to completely filled cabinets, not just your initial deployment). This cooling capacity ratio can be greatly impacted by modifying the air plenums (whether perforated floor tiles or overhead), encouraging neat cabling practices, sealing empty rack spaces, reducing the airflow rate, and increasing temperatures.
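The ratio calculation itself is straightforward once cooling capacity and heat load are in the same units. Here is a minimal sketch with hypothetical figures: the tonnage, cabinet count, and per-cabinet load are placeholders, and the tons-to-kW conversion uses the standard refrigeration constant.

```python
# Rough cooling-capacity-ratio check (all figures hypothetical).
# Convert installed cooling capacity and projected full-build-out IT
# heat load to kW, then compare. A ratio well above 1.0 suggests
# overcooling and wasted capacity.

TONS_TO_KW = 3.517             # 1 ton of refrigeration ~= 3.517 kW

cooling_tons = 120.0           # installed CRAC/CRAH capacity, tons
cabinets = 200                 # cabinets at FULL build-out, not day one
kw_per_cabinet = 1.6           # projected average heat load per cabinet

cooling_kw = cooling_tons * TONS_TO_KW
heat_load_kw = cabinets * kw_per_cabinet
ratio = cooling_kw / heat_load_kw

print(f"Cooling capacity:    {cooling_kw:.0f} kW")
print(f"Projected heat load: {heat_load_kw:.0f} kW")
print(f"Capacity ratio:      {ratio:.2f}")
```

Rerun the numbers as you adjust plenums, blanking panels, airflow rates, and setpoints; the goal is to watch how each change moves the ratio toward a comfortable margin rather than a large surplus.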

Play around with all of these factors while modeling containment to find a comfortable zone where you are not overcooling, not pushing your servers too close to failure, and operating as efficiently as possible.

Posted by Art Salazar, Director of Data Centers & Compliance, Green House Data
