How to Reduce Power Consumption in Data Center Operations
For most businesses, the simplest strategies for eliminating a challenge often go unnoticed. With emerging technologies taxing IT infrastructure and deployment, older techniques are often not considered when pursuing a solution. Power consumption falls into this category: a simple, yet overlooked, area for improvement.
The fact is, "energy conservation" has become an industry watchword, and there are a handful of very simple, easy-to-implement strategies that can produce immediate savings for virtually any business. None of these methods relies on introducing new technology. Here are five simple ways to reduce power consumption in the data center.
Computer Room Air Conditioner
1. Hot aisle/cold aisle
If the computer equipment in your data center is not configured in a hot aisle/cold aisle layout, it should be. This arrangement promotes the proper flow of hot and cold air, reducing the work required of both the HVAC system and the computer equipment's internal cooling. In this layout, server racks are set up facing one another in pairs, with the back of one row of racks facing the back of the next row. Data center equipment is built to take in cool air at the front of the machine and expel heated air at the back. Setting up the racks front-to-front in pairs channels cooler air along the fronts of both rows and warmer air between their backs, creating alternating "cold" aisles and "hot" aisles.
Note: All data centers use similar principles in rack layout, though they vary in where the computer room air conditioning (CRAC) units are located. To maximize the flow of heated air to the HVAC return ducts, rows of racks should be placed at 90-degree angles to the CRAC units. Positioned this way, the warmer air can flow unobstructed to the return ducts. Be careful not to force warm air from a hot aisle to travel over a cold aisle, as this will heat the cold aisle. Also note that airflow simulation software is available to model the data center before you physically install equipment.
The hot aisle/cold aisle layout uses the natural properties of cooler and warmer air to reduce the work required of the power-consuming cooling systems. Whenever warm and cold air are allowed to intermingle, the CRAC units and the computers' internal cooling mechanisms must work harder.
Computer equipment that is not designed to take in cool air at the front and expel it at the rear should be kept out of racks that rely on the airflow pattern inherent in a hot aisle/cold aisle layout. These noncompliant devices should be set up so that their warm exhaust is directed toward a hot aisle, or placed in cabinets capable of redirecting top- or side-vented exhaust to the rear of the rack.
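The alternating-aisle rule above can be captured in a few lines of code. The following is a minimal sketch using a hypothetical model: each rack row is described only by the compass direction its front (cool-air intake) faces, and each aisle between two rows is classified from the orientations of its neighbors.

```python
# Hypothetical model: classify the aisles in a row-by-row data center
# layout. Each rack row is described by the direction its front
# (cool-air intake) faces: "N" or "S", with rows ordered south to north.
# The aisle between two rows is "cold" when both fronts face it, "hot"
# when both backs face it, and "mixed" (inefficient) otherwise.

def classify_aisles(row_fronts):
    """row_fronts: list of 'N'/'S' orientations, ordered south to north."""
    aisles = []
    for south_row, north_row in zip(row_fronts, row_fronts[1:]):
        # The aisle sits north of south_row and south of north_row.
        south_faces_aisle = south_row == "N"  # front points into the aisle
        north_faces_aisle = north_row == "S"
        if south_faces_aisle and north_faces_aisle:
            aisles.append("cold")
        elif not south_faces_aisle and not north_faces_aisle:
            aisles.append("hot")
        else:
            aisles.append("mixed")
    return aisles

# A proper layout alternates: fronts face fronts, backs face backs.
print(classify_aisles(["N", "S", "N", "S"]))  # ['cold', 'hot', 'cold']
# All rows facing the same way produces only inefficient mixed aisles.
print(classify_aisles(["N", "N", "N"]))       # ['mixed', 'mixed']
```

A "mixed" aisle is exactly the inefficient front-facing-back arrangement called out in question 2 below: hot exhaust blows directly into another rack's intake.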
2. Proper temperature and humidity
Setting the right temperature and humidity levels in the data center is essential to proper airflow in the room. Too often there is a misconception that server rooms should be cold, and little monitoring is done to maintain optimal temperature and humidity. Many times the only sensor in the room is the one on the thermostat. Server rooms should be kept cool, but they don't need to be cooler than the average room temperature elsewhere on the site. The recommended temperature range for data center rooms is between 67 and 72 degrees Fahrenheit. Too often, businesses cool their data centers down to 65 degrees. It's fine to keep the room a couple of degrees warmer; the computer equipment will still be operating within recommended and optimal temperature ranges.
Know this: a reduction of roughly 4% in data center power consumption can be expected for each degree warmer the room is kept. Of course, equipment must stay within acceptable temperature ranges, generally not over 75 degrees, even though hardware specifications may rate the equipment for operation at temperatures up to 95 degrees. Machinery operating outside the acceptable temperature range wears out more quickly and risks overheating, which can lead to downtime. Keeping a server room cooler than the surrounding rooms also creates an additional problem: increased humidity.
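The 4%-per-degree rule of thumb above can be turned into a quick estimate. This is an illustrative calculation only, assuming the saving compounds per degree; the baseline figure is hypothetical, not a measurement.

```python
# Worked example of the rule of thumb: each degree Fahrenheit the
# setpoint is raised saves roughly 4% of cooling-related power draw.
# Baseline and setpoints below are illustrative, not measured values.

SAVINGS_PER_DEGREE = 0.04  # ~4% per degree F (rule of thumb)

def estimated_power(baseline_kw, degrees_raised):
    """Compound the ~4% saving for each degree the setpoint is raised."""
    return baseline_kw * (1 - SAVINGS_PER_DEGREE) ** degrees_raised

baseline = 100.0  # kW drawn at a 65 degree F setpoint (hypothetical)
for setpoint in (65, 68, 72):
    kw = estimated_power(baseline, setpoint - 65)
    print(f"{setpoint} F setpoint: ~{kw:.1f} kW estimated")
```

Even moving from 65 to 68 degrees, still comfortably inside the 67-72 degree recommended range, trims the estimated cooling-related draw by more than a tenth.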
The recommended humidity level for a data center is between 45% and 55%. Cooling the air too much can push humidity above acceptable levels. Once this happens, condensation can gather on sensitive, essential computer equipment and cause hardware failure. Without the right sensors in place to detect temperature and humidity throughout the data center, none of these air properties can be monitored. Sensors should be placed throughout the room to ensure all equipment is operating within acceptable temperature and humidity ranges. Also, know that there is a significant difference between conventional air conditioners and CRAC units: expect reduced power consumption and greater longevity with a CRAC unit installed.
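The monitoring described above amounts to checking each sensor against the recommended envelopes. A minimal sketch, with hypothetical sensor names and readings; the ranges are the ones given in this article (67-75 degrees F, 45-55% relative humidity):

```python
# Flag sensor readings outside the recommended ranges discussed above.
# Sensor names and reading values are hypothetical examples.

TEMP_RANGE = (67.0, 75.0)      # degrees F: recommended low to acceptable max
HUMIDITY_RANGE = (45.0, 55.0)  # percent relative humidity

def out_of_range(readings):
    """readings: {sensor_name: (temp_f, humidity_pct)} -> list of alerts."""
    alerts = []
    for name, (temp, rh) in readings.items():
        if not TEMP_RANGE[0] <= temp <= TEMP_RANGE[1]:
            alerts.append(f"{name}: temperature {temp} F out of range")
        if not HUMIDITY_RANGE[0] <= rh <= HUMIDITY_RANGE[1]:
            alerts.append(f"{name}: humidity {rh}% out of range")
    return alerts

readings = {
    "cold-aisle-1": (68.5, 50.0),  # within both ranges
    "cold-aisle-2": (64.0, 58.0),  # overcooled and too humid
}
for alert in out_of_range(readings):
    print(alert)
```

Note that the overcooled sensor trips both alerts at once, which is exactly the failure mode the article warns about: cooling the air too far drives humidity up with it.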
3. Proper floor tiles
The cold aisle should contain perforated tiles or grates to promote the flow of cooler air from beneath the raised floor up to the server air intakes. Placing these perforations in the cold aisle's raised floor takes advantage of the inherent properties of cooler air and reduces the work required of the HVAC system and the computers' cooling mechanisms. Hot aisles should not contain floor perforations, so that the flow of warmer air to the air conditioning return ducts remains unobstructed. Any time machinery is moved around in the data center, the perforated tiles in the cold aisles should be moved with it to keep the free flow of warmer and cooler air at its maximum.
4. Bypass air
Bypass air is any conditioned air in the data center that never reaches the computer equipment's air intakes; it represents an airflow inefficiency in the layout. Cooler air should rise from the floor only through the perforated tiles or grates in the cold aisle, and it should be prevented from escaping through the floor anywhere else. Bypass air often results from anomalies in the raised floor, such as holes cut to route electrical or network cables out of the room, or broken floor tiles. Look for these areas on the server floor and, where possible, seal them to keep bypass air to a minimum. Any time computer equipment is relocated, check the newly exposed floor for defects and repair them. Also check for bypass air at the cable cutouts on the rear of server cabinets: wherever cabling exits the back of a cabinet, the cutout should be properly sealed. Another place bypass air can occur is around the door to the data center; make sure this entryway is properly sealed.
5. Blanking panels
Finally, in the hot aisle/cold aisle configuration, blanking panels should be installed in server racks wherever there is no equipment. If blank spaces are left in the racks, these gaps allow hot exhaust air to re-enter the cold aisle, decreasing the efficiency of the whole configuration. The panels simply cover the openings at the front of the racks to block the mixing of hot and cold air. This simple fix maximizes the energy-saving potential of a hot aisle/cold aisle layout.
Five key questions to ask yourself:
1. Has reducing power consumption in the data center been overlooked as a way to cut costs? Too often the data center has grown along with the business, and its power requirements have grown with it. Power consumption in the data center needs to be optimized.
2. Is the data center laid out in a proper hot aisle/cold aisle configuration? Computer equipment in many facilities is placed in rows with the front of one rack facing the rear of another. This is an inefficient layout that increases the power-consuming work of both the HVAC system and the internal computer cooling mechanisms.
3. Is the temperature in the data center below the acceptable operating range of the equipment? A common misconception is that server rooms need to be cold. The truth is that data centers should be kept slightly warm and moderately humid, but not overly humid.
4. Is any air entering the room disrupting the natural airflow of the hot aisle/cold aisle configuration? Any holes around electrical conduits or cabling, or in the racks themselves, need to be sealed to ensure free, maximized airflow.
5. Are enough blanking panels in place to prevent warmer air from mixing with cooler air? Blanking panels are simple, inexpensive devices that keep warm and cool air from mixing.
Implementing these simple power-saving measures in the data center will result in immediate cost savings. The savings are considerable for almost any business, and the solutions are attainable at very low cost. The only equipment that requires purchasing is the air monitoring sensors and the blanking panels, and both are extremely inexpensive.
Energy costs are expected to continue rising for the foreseeable future. Power grids are operating at maximum capacity, and environmental concerns are limiting the construction of new power plants. Achieving optimized power efficiency in your facility is important to cutting the unnecessary costs associated with data center power consumption. Optimization is key throughout your data center, so don't miss out by overlooking the obvious.