As servers become more sophisticated and generate more heat, the data center liquid cooling industry is set to surpass $3 billion by 2026, according to a new research report by Global Market Insights. Overheating becomes a major point of concern when it results in power outages and unplanned downtime. This has made efficient, reliable cooling, power, and support systems crucial.
As rack power densities rapidly increase due to limited space, cooling efficiency is top of mind for companies in the industry. Cooling systems are very energy intensive, consuming as much energy as (or more than) the computers they serve, and they are expected to run continuously with the same effectiveness. Cooling strategies can be optimized by using computational fluid dynamics (CFD) simulation, which enables engineers to virtually test different data center designs, gain insights into airflow patterns, and discover “hot spots.”
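To illustrate the hot-spot idea behind this kind of analysis, the toy sketch below is not real CFD (no airflow, no turbulence modeling); it solves a simple 2D steady-state heat equation with point heat sources via Jacobi iteration, on a hypothetical two-rack layout, and reports the hottest grid cell:

```python
# Toy illustration of "hot spot" detection, NOT real CFD: a 2D
# steady-state heat equation with sources, solved by Jacobi iteration.
import numpy as np

def solve_steady_state(sources, boundary=18.0, iters=5000):
    """Repeatedly set each interior cell to the average of its four
    neighbours plus its local source term until the field settles."""
    T = np.full(sources.shape, boundary)  # walls held at room temperature
    for _ in range(iters):
        T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                T[1:-1, :-2] + T[1:-1, 2:]) + sources[1:-1, 1:-1]
    return T

# Two "racks" dissipating heat at fixed cells (hypothetical layout/values).
sources = np.zeros((20, 20))
sources[5, 5] = 2.0    # high-density rack
sources[14, 12] = 1.0  # lower-density rack

T = solve_steady_state(sources)
hot_i, hot_j = np.unravel_index(np.argmax(T), T.shape)
print(f"Hottest cell: ({hot_i}, {hot_j}) at {T[hot_i, hot_j]:.1f} degrees C")
```

Even this crude diffusion model shows the engineering value: the peak temperature lands at the high-density rack, which is exactly the kind of insight a full CFD study delivers with far more fidelity.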
Exactly how do data center owners and operators manage their cabling when their servers are immersed in liquid coolant?
Immersion cooling manufacturers understand the need for cable management when designing their cooling racks. Because the servers themselves are completely immersed, the cable connections are also made under the surface of the coolant. While the liquid used to cool sensitive IT equipment is non-conductive and non-flammable, it can damage certain types of cabling. Some PVC cable jackets stiffen over time from being immersed in the liquid coolant. Data center operators may choose to continue to use low-cost cabling with PVC jackets, like Ethernet cables, and simply swap them out when they stiffen. Otherwise, it’s best to use cables with synthetic rubber cladding, which the liquid coolant does not affect.
Row-level airflow management refers to improving cold aisle and hot aisle separation. It’s typically done once you’ve made improvements at the rack level (e.g., blanking panels) and raised-floor level (e.g., brush grommets). When we talk to data center operators about improving airflow efficiency at the row level, they tend to jump ahead to containment a little too quickly. The fact is, there are several areas in the row that can be addressed without engaging in a full-blown containment initiative.
For site owners at Green Mountain, the transition from ammunition storage facility to data center was not without challenges. They were faced with confined spaces, existing structures, and the need to install a reliable piping system to cool the server racks. But despite the challenges, it also offered many benefits, including the cold Norwegian climate to keep data cool at lower costs and the vast supply of hydroelectric power.
Airflow containment is an effective strategy for dealing with increasing cabinet densities while maintaining energy efficiency and keeping operating costs down in data centers.
Data center operators and facility managers continuously work to ensure temperatures remain consistent without raising energy bills. With so many options on the market, it can be hard to decide which is best. Beating data center heat is possible, though, and here are four ways to do it: employ regularly scheduled maintenance, optimize server racks for cooling efficiency, rethink your data center architecture, and increase data center temperatures.
Energy costs related to cooling account for approximately 37% of overall data center power consumption and are one of the fastest-rising data center operational costs! Join Panduit on September 25 at 10 a.m. CST to learn how effective airflow management products can reduce energy costs.
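A back-of-the-envelope calculation shows why that 37% share matters. The sketch below uses the article's 37% figure; the facility size and the efficiency gain are hypothetical numbers chosen purely for illustration:

```python
# Rough estimate of facility-level impact of a cooling-efficiency gain.
# cooling_share (37%) comes from the article; all other numbers are
# hypothetical placeholders for illustration only.
total_kw = 1000.0      # hypothetical total facility draw (IT + cooling)
cooling_share = 0.37   # cooling's share of total power (per article)
improvement = 0.20     # hypothetical 20% cut in cooling energy use

cooling_kw = total_kw * cooling_share       # cooling load in kW
saved_kw = cooling_kw * improvement         # kW saved by the improvement
facility_reduction = saved_kw / total_kw    # share of total draw saved

print(f"Cooling load: {cooling_kw:.0f} kW")
print(f"Saved: {saved_kw:.0f} kW ({facility_reduction:.1%} of total draw)")
```

In other words, because cooling is such a large slice of the total, even a modest 20% cooling-efficiency gain trims roughly 7% off the whole facility's energy bill.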
Data centers, whether they are multi-story hyperscale facilities in excess of 10 MW or small facilities designed to reduce latency at the edge of the network, require a means to reject the heat generated by the servers. The methods for achieving this are changing along with the industry as the demand for efficiency and sustainability continues to increase. Cooling methods may vary depending on facility size and location, but one thing remains constant: reliable thermal management is a mission critical requirement.
Not only are cooling costs for data centers already red hot and rising with energy prices, but they are also impacted by continual changes in IT equipment and racks. According to ENERGY STAR, the energy required for a data center’s cooling is 10 times greater than that required for other buildings, regardless of climate region. This webinar will provide tips on reducing cooling costs in both free-standing data centers and in-building centers.