By Eamonn Ryan

Despite data centres’ growing ubiquity, these facilities remain shrouded in secrecy.

ASHRAE temperature guidelines showing the recommended and allowable ranges. Image by ASHRAE

Data centres are increasingly becoming the backbone of the modern digital economy, powering everything from cloud computing to artificial intelligence, e-commerce and cryptocurrency mining.

One of the most striking aspects of data centres is the level of secrecy surrounding their operations. The physical spaces that house this critical infrastructure are often completely closed off and unbranded, with only a handful of authorised individuals allowed access to certain areas. One industry professional described a recent visit to a brand-new data centre in which entire sections of the facility were sealed off, with everything from HVAC systems to fire suppression equipment located outside the sealed sections.

Even the data centre owners were kept in the dark about the contents of these sealed areas. The servers, for instance, arrived fully covered in black cloths, making it impossible to discern what was inside. The data centre operator remains focused solely on maintaining the ideal environment for its clients’ operations – temperature, humidity, power supply and security – while having no insight into what those clients actually do within the space.

The rise of Bitcoin mining

Bitcoin mining, with its immense energy demands, has become an increasingly prominent use case for data centres. While traditional data centres host servers for businesses ranging from financial institutions to cloud service providers, cryptocurrency miners need enormous amounts of computing power to solve complex algorithms and ‘mine’ new Bitcoin. These operations require specialised equipment, making miners attractive but highly secretive tenants for data centre providers.

According to an industry presentation, it is estimated that by 2035, data centres could consume 14% of the world’s electricity.[1] This highlights the tremendous scale of power required to keep these operations running 24/7, particularly as more industries and governments shift to cloud computing, AI, and other data-intensive technologies.

In regions like South Africa, the construction of mega data centres is on the rise, but the benefits to the local economy can be unclear. While companies involved in the supply of high-quality components—such as cooling systems, control valves and air handlers—stand to benefit, the majority of the design and manufacturing for large-scale data centres is outsourced overseas. This means that while local suppliers may gain contracts, the overall value added to the local market is limited.

Energy-saving solutions for data centres

At a recent AERSA CPD points training session, Christian Preato, AERMEC’s export area manager for African countries, shared valuable insights on energy-saving solutions specifically tailored for data centres.

“Even if we are able to save 5-10% of the electricity bill, over time it becomes a substantial sum.”

He noted that CIBSE/ASHRAE guidelines have introduced new allowable ranges for temperature and humidity control within data centres – intended to remove obstacles to new cooling strategies. “In the past 5-10 years the limits have been widened to allow higher return air temperatures of 30°C or even 35°C. The main advantage is that we can use routine procedures for the HVAC units serving the computers within the data centres for many more hours during the day.”
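As a rough illustration of how these envelopes can be applied in monitoring logic, the sketch below classifies a measured server inlet temperature against the widely published ASHRAE recommended and allowable ranges for class A1 and A2 equipment (the threshold values and the function itself are illustrative assumptions, not figures from the presentation):

```python
# Illustrative sketch: classify an inlet air temperature against the
# commonly published ASHRAE TC 9.9 envelopes. The range values here are
# assumptions for illustration, not figures quoted in the article.
RECOMMENDED = (18.0, 27.0)                          # °C, all equipment classes
ALLOWABLE = {"A1": (15.0, 32.0), "A2": (10.0, 35.0)}

def classify(temp_c: float, equipment_class: str = "A1") -> str:
    """Return which ASHRAE envelope a measured inlet temperature falls in."""
    lo, hi = RECOMMENDED
    if lo <= temp_c <= hi:
        return "recommended"
    lo, hi = ALLOWABLE[equipment_class]
    if lo <= temp_c <= hi:
        return "allowable"
    return "out of range"

print(classify(26.0))          # recommended
print(classify(31.0, "A1"))    # allowable under the widened limits
print(classify(34.0, "A1"))    # out of range for A1 (though within A2)
```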

He described three main types of data centre HVAC system, starting with the chilled water system (though it is not necessarily the most efficient).

Chilled water system

These suit large-capacity data centres, producing cooling at a high EER (energy efficiency ratio) with low maintenance and a minimal footprint on the data centre floor, where only the computer room air conditioning (CRAC) units sit. There are several energy-saving opportunities, including high-temperature water production and external free cooling. The plant is centralised and uses less refrigerant.

On the negative side, more engineering time (and greater expertise) is required even for a small data centre, because a hydraulic system must also be designed. These systems need more infrastructure, and circulating water within a data centre calls for leak control, for instance.

Direct expansion (DX) system

Units are sized to match the load, improving efficiency thanks to new compressor technology and controllers. The modular design permits easy installation of pipework at low cost. However, Preato notes that this system is impractical for large data centres; has potentially higher energy consumption, especially at larger cooling capacities; and is also affected by the need for leak detection and the flammability of eco-friendly refrigerants. “It uses far more refrigerant, and it is inside the building.”

Indirect cooling system

“This is less used but is highly efficient because you use an external unit that exchanges the heat via external air, creating free cooling. Adiabatic cooling is used to support the cooling process at higher ambient temperatures. Mechanical cooling may be supplied as a top-up beyond the standard cooling process.”
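As a rough sketch of that staged logic, a unit controller might select the cooling stage as follows (the threshold temperatures and the function below are hypothetical illustrations, not values from the presentation):

```python
# Hypothetical sketch of staged control in an indirect cooling unit:
# dry air-to-air free cooling first, adiabatic assistance next, and
# mechanical cooling only as a top-up. Thresholds are illustrative.
def select_stage(ambient_c: float, return_air_c: float) -> str:
    if ambient_c <= return_air_c - 5.0:   # enough approach for dry free cooling
        return "free cooling"
    if ambient_c <= return_air_c + 5.0:   # wetted pads close the remaining gap
        return "free cooling + adiabatic"
    return "free cooling + adiabatic + mechanical top-up"

for ambient in (18.0, 32.0, 42.0):
    print(f"{ambient}°C -> {select_stage(ambient, return_air_c=35.0)}")
```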

He lists the pros as: enabling modular solutions for a large data centre; a low achievable pPUE (partial power usage effectiveness) thanks to adiabatic cooling operation; and a clean room environment free of external pollution. On the other hand, he says these systems require increased maintenance for water treatment, filters, fans and pumps, and above all are impractical where there are space restrictions. They are also sensitive to climate profiles, and water must be stored on site owing to the high water consumption.

“Usually, one would run two internal units at 100% capacity and have one on standby, but an option is rather to run all three at 66% capacity, with the benefit of improved air distribution within the data centre, thereby reducing hot spots. The system also works more efficiently. Operating at partial load can optimise performance, allowing facilities to reduce energy usage while still maintaining effective cooling – especially if they use inverter compressors and fans,” he says.

Case studies presented during the session illustrated the effectiveness of these strategies: a 42% reduction in energy consumption was achieved by running four 145 kW cooling units at partial load, compared with a plant running three such units with one on standby.
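Much of that saving follows from fan affinity laws: fan power scales roughly with the cube of airflow, so three units each moving two-thirds of the air draw far less power than two units at full speed. A minimal sketch of Preato’s two-versus-three comparison, using the cube-law approximation and a hypothetical unit rating (the 42% case-study figure also reflects compressor and free-cooling effects not modelled here):

```python
# Minimal sketch: why running more units at partial load can cut fan energy.
# Assumes fan power scales with the cube of airflow (affinity laws); the
# rated fan power is a hypothetical figure for illustration.
RATED_FAN_POWER_KW = 15.0   # fan power of one unit at 100% airflow (assumed)

def fan_power(airflow_fraction: float) -> float:
    """Approximate fan power at a given airflow fraction (cube law)."""
    return RATED_FAN_POWER_KW * airflow_fraction ** 3

option_a = 2 * fan_power(1.0)       # two duty units at 100%, one on standby
option_b = 3 * fan_power(2 / 3)     # all three units sharing the same airflow

print(f"2 x 100%: {option_a:.1f} kW")                        # 30.0 kW
print(f"3 x 66%:  {option_b:.1f} kW")                        # ~13.3 kW
print(f"fan energy saving: {1 - option_b / option_a:.0%}")   # ~56%
```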

Christian Preato, AERMEC’s export area manager for African countries. © RACA Journal

Options for improving chiller efficiency (EER)

EER is the ratio between cooling energy output and electrical energy input. The single most important way to improve the ratio is to raise the water temperature set points, allowing data centres to run existing systems more efficiently and minimise energy waste. Implementing dynamic or variable set points can help adapt to varying cooling needs, ensuring that systems operate only as much as necessary.
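As a simple worked example (with hypothetical figures, not values from the presentation), a chiller delivering 1 000 kW of cooling while drawing 250 kW of electricity has an EER of 4.0:

```python
# Minimal sketch of the EER calculation; all figures are hypothetical.
def eer(cooling_output_kw: float, electrical_input_kw: float) -> float:
    """Energy efficiency ratio: cooling output over electrical input."""
    return cooling_output_kw / electrical_input_kw

print(eer(1000.0, 250.0))   # 4.0
```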

The new generation of chillers designed for data centres supplies water at up to 30°C (compared with 15-20°C previously) and operates at external temperatures of up to 46°C. Free-cooling and glycol-free configurations can enhance the energy savings. This means that in many countries free cooling can be used year-round, as it kicks in when the external air temperature is 2°C lower than the water return temperature. Even in South Africa, ambient temperatures are below 30°C 70% to 80% of the time. A 5°C rise in water temperature produces a 10% improvement in EER, which Preato describes as “a big improvement for a chiller”.
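A minimal sketch of those two rules of thumb, the 2°C free-cooling approach and the roughly 10% EER gain per 5°C of water temperature rise (the example temperatures and base EER are assumptions; only the two rules come from the article):

```python
# Sketch of the free-cooling condition and the EER rule of thumb quoted
# above. The example temperatures and base EER are assumptions.
APPROACH_C = 2.0   # free cooling kicks in when ambient is 2°C below return water

def free_cooling_available(ambient_c: float, return_water_c: float) -> bool:
    return ambient_c <= return_water_c - APPROACH_C

def improved_eer(base_eer: float, water_temp_rise_c: float) -> float:
    """Apply the ~10% EER improvement per 5°C rise in water temperature."""
    return base_eer * (1 + 0.10 * water_temp_rise_c / 5.0)

print(free_cooling_available(27.0, 30.0))   # True: 3°C approach available
print(free_cooling_available(29.0, 30.0))   # False: only 1°C approach
print(improved_eer(4.0, 5.0))               # 4.4
print(improved_eer(4.0, 10.0))              # 4.8
```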

Moreover, Preato underscored the advantages of an increased Delta T (the difference between supply and return water temperatures): it extends cooling operation, reduces pumping energy costs, and allows smaller piping dimensions and a lower water volume.
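The flow and pipework effects follow from the heat balance Q = ṁ·cp·ΔT: for a fixed cooling load, a higher Delta T means proportionally less water flow, hence smaller pipes and lower pump power. A minimal sketch with hypothetical figures:

```python
# Sketch of how a higher Delta T reduces the required water flow.
# Q = m_dot * cp * dT; the load and Delta T values are hypothetical.
CP_WATER = 4.186   # kJ/(kg·K), specific heat of water

def flow_kg_s(load_kw: float, delta_t_k: float) -> float:
    """Mass flow needed to carry a given cooling load at a given Delta T."""
    return load_kw / (CP_WATER * delta_t_k)

load = 1000.0   # kW cooling load (assumed)
for dt in (5.0, 8.0, 10.0):
    print(f"Delta T = {dt:>4} K -> {flow_kg_s(load, dt):5.1f} kg/s")
# Doubling Delta T halves the flow; because pump power in a fixed circuit
# rises roughly with the cube of flow, the pumping saving is larger still.
```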

In terms of free-cooling regulation, Preato noted that “the condensing pressure regulation is done by excluding sections of the condensing coil through proper two-way valves, thereby reducing the exchange area without slowing down the fan speed”.

Challenges and considerations

Preato acknowledged the complexities involved in designing efficient data centre systems. A customised design must consider variable loads and year-round operation, and evaluate climate profiles and cooling loads.

He discussed the challenges associated with different cooling technologies, including the efficiency of chilled water systems, the potential hazards of flammable refrigerants in DX systems, and the space limitations of air-cooled systems.

AERMEC’s energy simulation software (ACES) considers the current and future installation site design, the required cooling load, the climatic profile, the use of the facility, potential renewable energy integrations, the available space, noise constraints, redundancy, maintenance and more. “All these inputs are applied in an iterative analysis process which calculates the best possible solution considering all the involved variables, to generate a customised and optimised proposal. All the benefits of the new cooling technologies can be maximised using the proper system control. The key to the new-generation controller is its logic based on the chiller’s actual working data: the controller features have been developed using extensive laboratory data obtained in real simulations, and the sampling points are gathered, mapped and transformed into a simulation model which characterises all the possible real working conditions for chillers.”
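The shape of such an input set might be sketched as follows (a hypothetical structure for illustration only; the field names do not reflect ACES’s actual data model):

```python
# Hypothetical sketch of the kind of inputs a selection tool such as ACES
# weighs. Field names and values are illustrative, not AERMEC's actual API.
from dataclasses import dataclass, field

@dataclass
class SiteInputs:
    cooling_load_kw: float            # required cooling load, current and future
    climate_profile: str              # e.g. hourly ambient-temperature bins
    facility_use: str                 # colocation, hyperscale, edge, ...
    available_space_m2: float         # plant space constraint
    noise_limit_dba: float            # site noise constraint
    redundancy: str                   # e.g. "N+1" or "2N"
    renewables: list[str] = field(default_factory=list)

site = SiteInputs(
    cooling_load_kw=4000.0,
    climate_profile="Johannesburg hourly bins",
    facility_use="colocation",
    available_space_m2=600.0,
    noise_limit_dba=65.0,
    redundancy="N+1",
    renewables=["rooftop PV"],
)
print(site)
```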

Preato emphasises the importance of rigorous testing in collaboration with industry partners like Equinix, one of the largest data centre providers globally. Before deployment, chillers are extensively tested in controlled environments to ensure they meet performance specifications. This ensures reliability and capacity, critical factors for data centres where cooling failures can lead to severe operational disruptions.

“The scale of these systems is impressive, often featuring multiple chillers with capacities exceeding 10 MW. Most installations utilise advanced, high-temperature chillers to maximise efficiency and effectiveness in their cooling solutions.”

In Africa, data centres often face constraints related to water supply and quality, making the implementation of water-cooled systems a topic that requires careful planning and consideration. The discussion acknowledges that while it is possible to have water-cooled systems in these regions, they demand comprehensive studies and strategic designs to ensure they can function effectively under local conditions.

Preato emphasises the balance that must be struck between efficiency and system complexity. “For example, while adding a secondary heat exchanger can provide benefits, it also complicates the system and can reduce overall efficiency. In making decisions, data centre operators must weigh these factors against their specific operational needs and constraints.”

The size of the data centre also plays a critical role in determining the most suitable cooling solution. Smaller data centres may require simpler solutions, while larger installations can justify the investment in more complex systems if they promise greater efficiencies.

“Some businesses may prefer straightforward systems due to budget constraints or operational simplicity, while others may be willing to invest in advanced solutions for greater efficiency. Understanding these preferences is crucial for manufacturers and consultants when proposing solutions. There is no one-size-fits-all solution for data centre cooling. Each location and project presents its own challenges and opportunities, making strategic decision-making essential. The goal is to optimise the cooling solution for each specific scenario, considering factors like climate, available resources, and operational demands,” he concludes.

EBM Papst explores data centre market growth at Pan African Data Centre Conference

In an interview on the sidelines of the 2024 Pan African Data Centre Conference in Johannesburg, Francois Schoombie, technical manager at ebm papst South Africa, discussed the company’s foray into the data centre cooling sector and its unique market approach.

The company, a German-owned subsidiary renowned for its electronically commutated (EC) technology in fans, has strategically entered the data centre cooling segment due to the widespread use of its components in various cooling units. “Our fans are integral components in data centre cooling units across the globe. Previously, most inquiries came through our OEMs without our direct knowledge. Now, we’re seeing more inquiries directly from end users and data centre developers,” Schoombie says.

“Many end users are unaware that our fans are part of their cooling solutions. It’s crucial for us to establish our footprint and ensure that our support services are readily accessible,” he adds.

The EC technology is known for its energy efficiency compared to conventional AC motors, giving it a significant advantage in markets with stringent energy standards like Europe. “In South Africa, where such regulations are still evolving, our technology also offers substantial efficiency gains,” Schoombie explains.

Commenting on the data centre market in Africa, Schoombie expresses optimism about its growth potential driven by technological advancements and infrastructure developments. “South Africa is poised as a hub for data centre expansion in Africa, supported by increased connectivity and technological advancements.”

With a diversified product portfolio spanning industries from automotive to refrigeration, ebm papst sees the data centre sector as a strategic growth area. Schoombie elaborates on the expanding demand within the data centre sector, emphasising its significant growth potential. “It’s a huge potential going forward,” he explains.

Despite the critical role of cooling systems in data centres, Schoombie highlights a gap in cooling knowledge among data centre developers. “Unfortunately, they don’t know enough about the subject of cooling. Many are aware of their units’ functions but lack detailed knowledge of cooling specifics.” This gap motivates ebm papst to raise awareness and educate local stakeholders about the importance of effective cooling solutions, as well as the benefits of EC fans.

Schoombie emphasises the necessity for tailored mechanical designs to manage increasing heat loads. “Data centres require different cooling designs to dissipate heat efficiently. Our fans are versatile, capable of supporting various airflow configurations crucial for heat dissipation.”

Regarding retrofitting older data centres for improved cooling efficiency, Schoombie notes that while retrofits are less common due to industry reliance on lifecycle maintenance, ebm papst has been involved in retrofit projects with OEMs. “Retrofitting involves proactive maintenance strategies or fan replacements. This ensures ongoing operational efficiency without major disruptions.”

Considering this critical need for uninterrupted service in data centres, Schoombie notes the meticulous planning required in maintenance activities. “Major data centres coordinate downtime for maintenance to minimise disruptions. This coordination is essential for ensuring continuous cooling and operational integrity.”