An interview with CAREL’s Vice President Luigi Nalini. This is Part 2 of a three-part series.

…continued from Part 1.

LN The 1970s saw the rise of mainframes, computers in metal cabinets that were installed in special rooms (the leader at the time was IBM).

Then came the first terminals, with keyboards and screens, as interfaces with the mainframe, which processed the data and output the results as paper printouts and – as solid-state memory did not yet exist – on magnetic media.

Magnetic tapes, multi-platter disk drives and, up to the mid-1970s, drums (rotating cylinders coated with magnetic material) used one or more magnetic heads to write to or read from the surface of the storage medium.

With the miniaturisation of microprocessors, the density of heat emitted by chipsets began to grow rapidly, thus making it necessary to upgrade computer room air conditioning systems.

One especially critical aspect was the control of relative humidity, which is important for magnetic storage and printers, as well as for the mainframe itself.

Low relative humidity, together with friction with the cooling air flow, was a cause of harmful static discharges for integrated circuits.

Furthermore, constant temperature and humidity were needed to allow the safe operation of magnetic media: the information density per surface area was quite low, and in order to achieve a sufficient read/write speed, the devices moved very quickly; meanwhile, the distance between the heads and the surface of the magnetic material was minimal compared to the size of the disks and drums.

Significant deviations in temperature-humidity conditions caused dimensional variations that could lead to head crashes.

Finally, incorrect relative humidity disrupted the proper functioning of high-speed roller printers, causing high-voltage electrostatic discharges or paper jams.

EB Very interesting; it’s clear, then, why the term mission critical is often used for this application and has over time replaced the term precision air conditioning. In the past we referred to CCUs, or close control units, while nowadays the most widely used term is CRAC/H, computer room air conditioner/handler. How did this term come about?

LN Mainframes heated the air, while manufacturers specified quite stringent operating temperature and humidity conditions, probably with safety margins that were far greater than necessary.

However, at the beginning the heat emitted by mainframes was on the order of a few tens of W/m² of occupied floor area, while typical computer rooms measured 300-400 m², so there was no need for specialised air handling systems or solutions.

The increase in computing power and the miniaturisation of electronic components quickly pushed this figure above 200-300 W/m². Consequently, room units were developed in the USA for use with mainframes.

As thermal density continued to increase, it became more difficult to deliver cooling air to all the equipment. Raised floors, until then used for laying cables, increasingly began to be used as a plenum to distribute cooled air from so-called downflow units, which delivered the temperature-controlled air downwards into the raised floor. The specification was that the air in contact with the chips needed to be around 22°C, with limited fluctuations, as it was believed that temperature gradients could cause microcracks in the electronic circuits.
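As a rough illustration of the air-delivery problem described above (the figures are hypothetical, not from the interview), the volume of air a downflow unit must push through the raised floor follows from the sensible-heat equation Q = ρ·V̇·cp·ΔT:

```python
# Rough sensible-cooling airflow estimate for a raised-floor computer room.
# All figures are illustrative placeholders in the ballpark mentioned above.

RHO_AIR = 1.2     # kg/m^3, air density at ~20 degC
CP_AIR = 1005.0   # J/(kg*K), specific heat of air at constant pressure

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volume flow (m^3/s) needed to absorb heat_load_w with a delta_t_k air temperature rise."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# A 350 m^2 room at 250 W/m^2, with supply air warming by ~6 K before returning:
heat_load = 350 * 250  # 87.5 kW total
flow = required_airflow_m3s(heat_load, 6.0)
print(f"{flow:.1f} m^3/s")  # on the order of 12 m^3/s
```

At these densities the required airflow already amounts to many complete room air changes per hour, which is why an underfloor plenum became preferable to simply blowing air into the room.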

EB This design remained unchanged for many years; for some time now, however, thanks in part to the spread of layouts that separate the air entering and leaving the computers, and to the work done by ASHRAE Technical Committee 9.9, the specified air operating conditions have been considerably relaxed, with temperatures ranging from 18 to 27 degrees and humidity up to 60% and beyond. This has paved the way for solutions based on free cooling. In your opinion, why did it take so long for this awareness to come about?

LN Data processing systems, which require a significant capital investment, are essential for a company’s operations and their correct functioning is of the utmost importance.

For quite some time, computer centre management was highly demanding, to the extent that technical issues and maintenance at the user’s site were entrusted to technical personnel employed by the mainframe manufacturer, at understandably high cost.

Consequently, energy consumption was far less important than overall reliability, and the incidence of the energy bill on the total cost of data centre ownership was very low.

However, over time the cost of servers has come down, while the cost of energy has risen so much that running costs over a system’s life cycle are often not much lower than the capital investment: indeed, this would be an interesting metric to track.

Continued in Part 3…

Source: CAREL website