The Logical Data Center

  • Energy Saving
  • Carbon Reduction
  • Real Estate Increase
  • Zero Water Use
  • Operational Savings
  • Simple Design
  • Modular & Retrofittable
Yesterday’s Thinking for Today’s Data Centers

Nobody invites change, especially when their existing business model works and is profitable, so change normally needs key drivers. The data center industry is no different, and the need for change is beginning to accelerate.

Data center owners are being held accountable for their environmental impact: poorly designed data centers create massive carbon emissions, waste vast amounts of valuable potable water and continue to use ozone-depleting refrigerants. Because emotion alone doesn’t always create change, government legislation, alongside demands from customers and stakeholders, is starting to be introduced to combat high carbon emissions and encourage sustainability.

Technology change is also a key driver in data center utilization. New chip sets, with their leap in compute power and the resulting increase in heat load, will leave many air-moving data center designs, even with bolt-on improvements, simply unable to cope.

But it will be the customers who choose to use eco-friendly data centers that prove to be the tipping point. The good news is, change is coming!

The Modern Data Center Approach

A modern data center design should meet each of these drivers and, at the same time, offer more. A ColdLogik data center does exactly that: robust, adaptable and able to control the whole room temperature from a single solution, sustainably and more profitably.

Capable of low, medium and high duty using the same technology and control philosophy – it’s not genius, it’s simply ColdLogik!

The Range

CL21 Smart Passive

Range:

Up to 29kW sensible cooling
42U, 47U and 52U heights
600mm, 750mm and 800mm widths

Datasheet

CL21 ProActive

Range:

Up to 30kW sensible cooling
42U, 47U and 52U heights
600mm, 750mm and 800mm widths

Datasheet

CL20 ProActive

Range:

Up to 92kW sensible cooling
42U, 48U and 52U heights
600mm, 750mm and 800mm widths

Datasheet

CL23 HPC ProActive

Range:

Up to 200kW sensible cooling (actual headroom 220kW+)
48U and 52U heights
800mm width

Datasheet

How the ColdLogik Solution Works

ProActive RDC (RDHX)

The multi-award-winning ColdLogik Rear Door method of cooling is best described by the term ‘Air Assisted Liquid Cooling’, or AALC for short.

AALC offers the best of both worlds, enabling higher densities in standard data center designs and bringing levels of efficiency that are truly capable of enabling change in your next retrofit or new-build project.

Ambient air is drawn into the rack by the IT equipment fans. The hot exhaust air is expelled from the equipment and pulled over the heat exchanger, assisted by EC fans mounted in the RDC chassis. The exhaust heat transfers into the cooling fluid within the heat exchanger, and the newly chilled air is returned to the room at, or just below, the predetermined room ambient temperature, a design based on sensible cooling.
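The air-side heat balance implied here can be sketched with the standard sensible-heat relation Q = ρ · V̇ · cp · ΔT. The duty and temperature-rise figures below are illustrative assumptions, not USystems specifications:

```python
# Sensible cooling across a rear door: Q = rho * V_dot * cp * dT.
# Air properties at roughly room temperature; figures are illustrative.
RHO_AIR = 1.2     # kg/m^3, air density at ~20 C
CP_AIR = 1005.0   # J/(kg*K), specific heat capacity of air

def airflow_for_duty(duty_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry `duty_w` watts of
    sensible heat at an air-side temperature rise of `delta_t_k`."""
    return duty_w / (RHO_AIR * CP_AIR * delta_t_k)

# Example: a 29 kW rack with an assumed 15 K air-side delta-T
flow = airflow_for_duty(29_000, 15.0)
print(f"{flow:.2f} m^3/s")  # ~1.60 m^3/s
```

This is why higher-density racks need either more airflow or a larger air-side temperature difference across the door; the door's EC fans supply the former without relying solely on the IT equipment fans.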

Both processes are managed by the ColdLogik adaptive intelligence present in every ProActive RDC. In this way the Rear Door Cooler uses air-assisted liquid cooling to control the whole room temperature automatically at its most efficient point.

CL21 ProActive RDC (RDHX)

Resiliency Features

  • Up to N+4 on fans
  • A and B power feeds (ATS) as standard
  • Specifically designed PCBs for enhanced functionality in the event of failure
  • Hot swappable fans
  • Universal fans reducing the need for localized resiliency stock
  • Leak prevention system available for deployment if requested

Smart Passive RDC (RDHX)

The passive RDC operates in the same way but, because it has no controller or fans, it relies purely on the IT equipment fans producing enough static air pressure to self-cool. In the correct deployment, the passive RDC is truly the most efficient cooling method available today. If the active equipment cannot produce sufficient airflow, the CL21 Smart Passive also provides an upgrade path without the entire unit needing to be replaced, as it would with other products. The airflow shown for the CL21 RDC above therefore applies to both active and passive models.

ColdLogik Solution

ColdLogik RDCs enhance the efficiency of most data centers without any changes to their current design. However, greater energy efficiencies are achievable when the complete ColdLogik solution is deployed.

By negating the heat at source and removing the need for air mixing or containment, you can significantly increase the supply water temperature, which makes more efficient external heat rejection options available. In some scenarios the ColdLogik RDC solution removes all compressor-based cooling, enabling free cooling all year round.

CL23 HPC RDC (RDHX)

Hoses can be top or bottom fed


ColdLogik Rear Door Coolers vs Other Technologies

The ColdLogik RDC is unique in its proposition. It can efficiently service any duty range whilst offering resiliency in operation and unrivalled flexibility when pairing with heat rejection equipment, all whilst negating 100% of the heat generated, which is unique by comparison to other HPC deployment methods.

Energy and Water savings - explained

The ColdLogik RDC solution can dramatically change the efficiency of data center environments: year-round free cooling can cut the energy consumed by cooling by over 90%, and higher water temperatures in a closed-loop system can save monumental volumes of water for deployment in communities that need the resource more. The ColdLogik cooling system truly encapsulates the next step in site-wide efficiency gains.


Uptime and reliability have always driven data center design, and trade-offs have been made so that both operators and equipment stay comfortable. Consistent design brings down the risk of error whilst keeping the environment easily manageable.

Whilst data centers are more efficient than they have ever been as a whole, there is still vast room for improvement, in particular in the energy consumed by cooling and the water consumed by adiabatic cooling. ColdLogik Rear Door Coolers unlock the true potential of free cooling in every environment whilst looking after 100% of the heat generated at room level.

One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW is 68,000 litres a day. Unfortunately information is scarce, so if a conservative figure of 1,000MW is assumed across the country, this would give a usage of around 68 million litres of water per day.
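As a quick sketch of that arithmetic, using the text's own assumed figures rather than measured data:

```python
# Scaling the quoted per-MW water figure to a national fleet.
# Both inputs are the text's own assumptions, not measured data.
LITRES_PER_MW_DAY = 68_000   # quoted figure, litres per MW per day
assumed_fleet_mw = 1_000     # conservative national capacity assumption

daily_litres = LITRES_PER_MW_DAY * assumed_fleet_mw
print(f"{daily_litres:,} L/day")                         # 68,000,000 L/day
print(f"{daily_litres * 365 / 1e9:.1f} billion L/year")  # ~24.8 billion L/year
```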

Energy Usage Comparison - Arizona

In the following video we will discuss the energy usage of a traditional perimeter air conditioning unit, a high efficiency perimeter air conditioning unit and USystems Rear Door Cooler in Arizona.

Energy Usage Comparison - Oslo

In the following video we will discuss the energy usage of a traditional unit, a high efficiency perimeter air conditioning unit and USystems Rear Door Cooler in Oslo.

Energy Usage Comparison - Dublin

In the following video we will discuss the energy usage of a traditional unit, a high efficiency perimeter air conditioning unit and USystems Rear Door Cooler in Dublin.

Energy Usage Comparison - Pune

In the following video we will discuss the energy usage of a traditional unit, a high efficiency perimeter air conditioning unit and USystems Rear Door Cooler in Pune.

Energy Usage Comparison - Beijing

In the following video we will discuss the energy usage of a traditional unit, a high efficiency perimeter air conditioning unit and USystems Rear Door Cooler in Beijing.


With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe; at one point, half of the human race was in lockdown in one form or another.

Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its need, through to a sterner examination from the wider world of energy consumption and water usage.

Uptime and reliability have always driven the major data center design philosophies. Trade-offs have been made, understandably, for the comfort of operators and owners: keeping design consistent across sites reduces variability and brings down the risk of misalignment or miscalculation.

Whilst data centers are more efficient than they have ever been as a whole, there is still vast room for improvement, in particular in the energy consumed by cooling and the water consumed by adiabatic cooling.

In 2014, Lawrence Berkeley National Laboratory in California issued a report stating that 639 billion litres of water were used on data center cooling in the USA alone; for 2020 the forecast was a startling 674 billion litres.

What is adiabatic cooling?

Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, extracting as much heat as possible from the data center using the natural environment before mechanical chillers need to step in.

Any body of air has two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is walking out of the shower before you manage to get to the towel! The water on your body evaporates as you move through the air, cooling your skin towards the wet bulb temperature, which is always equal to or lower than the DB temperature.

For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then air pushed through a wet area or membrane can potentially be cooled down to the WB temperature, until the membrane is heated or dries out.
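For readers who want to estimate the wet bulb from everyday weather data, one published empirical approximation is Stull's (2011) fit from dry-bulb temperature and relative humidity. A minimal sketch:

```python
import math

def wet_bulb_stull(t_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    and relative humidity, using Stull's (2011) empirical fit.
    Valid roughly for 5-99% RH and -20 to +50 deg C dry bulb."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# 20 C dry bulb at 50% relative humidity
print(round(wet_bulb_stull(20.0, 50.0), 1))  # ~13.7
```

This is only a fit to psychrometric tables, but it is handy for back-of-envelope checks of the kind of DB/WB gap the example above describes.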

Why is this usage so high?

Water usage is inversely related to the temperature of the water flowing to the data centre’s internal cooling equipment: the lower the flow temperature, the higher the water usage by the external plant. Traditional plant has a normal water flow temperature of 7°C/45°F, which means the highest ambient temperature at which you could reach the desired flow temperature naturally is around 5°C/41°F.
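That flow-temperature logic can be sketched as a simple threshold check. The 2 K approach between ambient air and achievable water temperature is an illustrative assumption, not a USystems figure:

```python
# Free-cooling availability as a function of supply water temperature.
# APPROACH_K is an assumed heat-exchanger approach temperature.
APPROACH_K = 2.0

def can_free_cool(ambient_c: float, supply_water_c: float) -> bool:
    """True when ambient air alone can produce the required supply water."""
    return ambient_c <= supply_water_c - APPROACH_K

# Traditional plant at 7 C supply: free cooling only at or below ~5 C ambient.
print(can_free_cool(5.0, 7.0))    # True
print(can_free_cool(12.0, 7.0))   # False
# An elevated supply temperature (e.g. 20 C, as a rear door can allow)
# widens the free-cooling window dramatically.
print(can_free_cool(12.0, 20.0))  # True
```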

How can you improve this usage?

The best way to reduce this usage is to elevate the water temperature the data centre requires to cool the equipment efficiently and effectively. The rear door cooler is a great example: unlike traditional CRAC systems, which mix colder air with warm air to provide an ambient temperature, it neutralises the heat in the air itself, so a higher water temperature can achieve the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.

What happens when you implement a ColdLogik rear door?

SanFrancisco 3 WUE Graph
SanFrancisco 4 WUE Graph

In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in grey.

In the case of San Francisco and the Bay Area, it’s clear that with the traditional approach you would, on average, need the adiabatic system all year round, as well as mechanical assistance all year round at varying load. Moreover, as most chillers have a minimum run of 25%, even less free cooling could be available.

By utilizing the ColdLogik door, on average, you would not need any additional water for adiabatic cooling for 7 months of the year, nor any mechanical assistance through the remaining 5 months. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave outside of the average occurs, but they may never actually need to run, producing an energy saving too.

Conclusion

In conclusion, without even considering the lower water usage across the remaining 5 months, which could be substantial, the ColdLogik door would likely save a minimum of 58% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over a year, this could drop the current projected figure of 674 billion litres of water to 283 billion litres, a 391 billion litre reduction. That is the equivalent of filling 156,400 Olympic swimming pools, which would cover an area 1.5 times that of the city of San Francisco.
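A quick sanity check of those figures, assuming the nominal 2.5 million litre volume of an Olympic pool (50 m × 25 m × 2 m):

```python
# Sanity-checking the headline reduction against the quoted pool count.
OLYMPIC_POOL_L = 2_500_000  # assumed nominal Olympic pool volume, litres

projected = 674e9   # litres/year, traditional cooling (LBNL-based forecast)
with_rdc = 283e9    # litres/year, with ColdLogik doors (per the text)

saved = projected - with_rdc
print(f"{saved / 1e9:.0f} billion litres saved")  # 391
print(f"{saved / OLYMPIC_POOL_L:,.0f} pools")     # 156,400
```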

If you are looking to improve your water usage with a product that is tried, tested and deployed worldwide, get in touch with USystems today.

Conventional air cooling consumes significant energy when using mechanical chillers. One way to reduce, and potentially eliminate, that energy wastage is adiabatic cooling; but whilst this significantly improves efficiency, it exponentially increases the water used for evaporative cooling, and water is increasingly scarce in certain geographical locations. A typical large-scale data center consumes the water equivalent of 2,500 people, which is putting pressure on local governments to drive water usage down.

By utilising liquid cooling you can increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up. It really is a WIN-WIN-WIN.



One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW is 68,000 litres a day. Whilst public information is scarce, a very conservative figure for cooling water usage in the Nordics is around 20 million litres a day. Importantly, though, a large proportion of data centre owners have used the area’s climate to reduce mechanical power requirements, which, whilst increasing water usage, provides greater overall efficiency for traditional systems.


Nordics 1 WUE Graph
Nordics 2 WUE Graph

As you can see above, the Nordic region provides very low dry and wet bulb temperatures for a large proportion of the year, which helps efficiency as a whole.

The important factor here is that anything above the blue line can be cooled using the DB alone and therefore requires no additional water. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line requires additional mechanical cooling, such as a traditional chiller system, using maximum water plus additional power for the mechanical equipment.
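The three bands can be expressed as a simple classifier over ambient temperature. The threshold values here are illustrative assumptions, not values read off the graphs:

```python
# The three cooling bands described above, as a classifier.
# Threshold temperatures are illustrative assumptions only.
def cooling_mode(ambient_c: float, free_cool_limit_c: float = 5.0,
                 adiabatic_limit_c: float = 18.0) -> str:
    """Classify an ambient condition by the plant it needs:
    'free' (DB alone), 'adiabatic' (water), or 'mechanical' (chiller)."""
    if ambient_c <= free_cool_limit_c:
        return "free"
    if ambient_c <= adiabatic_limit_c:
        return "adiabatic"
    return "mechanical"

# Rough monthly average highs for a cool climate (illustrative numbers)
months = [-1, 0, 4, 9, 15, 20, 22, 21, 16, 10, 4, 1]
print([cooling_mode(t) for t in months])
```

Raising either threshold, which is what a higher supply water temperature effectively does, moves months out of the "mechanical" and "adiabatic" bands and into "free".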

What happens when you implement a ColdLogik rear door?

Nordics 3 WUE Graph
Nordics 4 WUE Graph

In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in grey.

In the case of the Nordic region, it’s clear that with the traditional approach you would, on average, need the adiabatic system for two thirds of the year, and mechanical assistance for just under half of the year at varying load. Moreover, as most chillers have a minimum run of 25%, even less free cooling could be available.

By utilizing the ColdLogik door, on average, you would not need any additional water for adiabatic cooling for 9 months of the year, nor any mechanical assistance through the remaining 3 months. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave outside of the average occurs, but they may never need to run, producing an additional operational saving.

Conclusion

In conclusion, without even considering the lower water usage across the remaining 3 months, which could be substantial, the ColdLogik door would likely save a minimum of 50% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over a year, and based on publicly available information for the Nordic region, this could drop the current projected figure of 4.86 billion litres of water to 2.43 billion litres, a massive 50% drop. That is the equivalent of filling Iceland’s famous Blue Lagoon a whopping 270 times, which really puts it into perspective.




One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW is 68,000 litres a day. Even taking only the 13 largest data centre operations in the UK, this equates to 58,412,000 litres of water used each day.


London 1 WUE Graph
London 2 WUE Graph

As someone who lives in the UK, I can safely say our weather isn’t always the best; however, it provides a wonderful opportunity to eliminate excess water use.


What happens when you implement a ColdLogik rear door?

London 3 WUE Graph
London 4 WUE Graph

In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in grey.

In the case of the United Kingdom, and the London area in particular, it’s clear that with the traditional approach you would, on average, need the adiabatic system all year round, and mechanical assistance for over half of the year at varying load. Moreover, as most chillers have a minimum run of 25%, even less free cooling is available.

By utilizing the ColdLogik door, on average, you would not need any additional water for adiabatic cooling for 8 months of the year, nor any mechanical assistance through the remaining 4 months. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave outside of the average occurs, but they may never need to run, producing an additional operational saving.

Conclusion

In conclusion, even without counting the lower water usage across the remaining 4 months, which could be substantial, the ColdLogik door would likely save a minimum of 66% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the 13 largest publicly documented data centres in the UK, this could drop the current projected figure of 21.32 billion litres of water down to 7.11 billion litres, a 14.21 billion litre reduction. That is the equivalent of filling 5,550 Olympic swimming pools, which would cover an area more than 130 times that occupied by Windsor Castle and its grounds.
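A quick sanity check of the headline figures (the 2.5 million litre pool volume is a nominal assumption, which is why the pool count comes out slightly different from the 5,550 quoted above):

```python
current = 21.32e9      # litres per year, projected (13 largest UK data centres)
with_rdc = 7.11e9      # litres per year with ColdLogik rear doors
saved = current - with_rdc

print(f"{saved / 1e9:.2f} billion litres saved")   # 14.21
print(f"{saved / current * 100:.0f}% reduction")   # 67

olympic_pool = 2.5e6   # litres, nominal 50 m Olympic pool
print(f"{saved / olympic_pool:,.0f} pools")        # 5,684
```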

If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.

Conventional air cooling traditionally consumes significant energy when mechanical chillers are used. One way to reduce, and potentially eliminate, this energy wastage is adiabatic cooling; but while it significantly improves efficiency, it sharply increases water usage to supply the evaporative process. The major downside is the growing scarcity of water in certain geographical locations: a typical large-scale data center consumes the equivalent of 2,500 people's water, which is putting pressure on local governments to drive water usage down.

By utilising liquid cooling you can effectively raise the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted, and better energy efficiency with a simpler site setup. It really is a WIN-WIN-WIN.

What happens when you implement a ColdLogik Rear Door? In the graphs next to this you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of Bangalore and the Indian market, it’s clear that with the traditional approach you would, on average, need the adiabatic system for most of the year, plus mechanical assistance at varying load. Moreover, as most chillers have a minimum run of 25%, less free cooling may be available. By utilising the ColdLogik door you would, on average, require no additional mechanical cooling on site for standard operation. Chillers with refrigeration circuits would normally remain on site for redundancy in case of exceptional need, but would not be required on a regular basis. Water usage on the ColdLogik system would also be lower for 6 months of the year, most likely accounting for a drop of around 20% across that period.

India 3 WUE Graph

With the current situation the world finds itself in, one thing has become abundantly clear: data centers have given people a safe haven in their own homes while lockdowns have been enforced across the globe; at one point half of the human race was in lockdown in one form or another.

There have been both positives and negatives arising from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its necessity, through to a sterner examination from the wider world of energy consumption and water usage.

Uptime and reliability have always driven mainstream data center design philosophy. Trade-offs have been made, understandably, to give operators and owners the comfort of knowing that consistent design across sites reduces the risk of misalignment or miscalculation.

Whilst data centers are more efficient than they have ever been as a whole, there is still vast room for improvement, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.

One of the major active equipment manufacturers has openly said that a realistic figure for water use can be 68,000 litres per MW per day. Whilst public information is scarce, a very conservative figure for water used for cooling in the Indian market is around 34 million litres per day, based on 500 MW of cooling capacity across the country.
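The 34 million litre figure follows directly from the quoted per-MW rate:

```python
litres_per_mw_day = 68_000   # quoted manufacturer figure, litres per MW per day
capacity_mw = 500            # conservative assumed cooling capacity for India

daily = litres_per_mw_day * capacity_mw
print(f"{daily / 1e6:.0f} million litres per day")  # 34
```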

What is adiabatic cooling?

Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat from the data center as possible using the natural environment before the mechanical chiller has to step in.

Any body of air has two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the moment you walk out of the shower before you reach the towel! The water evaporating from your body cools your skin towards the wet bulb temperature, which is always equal to or lower than the DB temperature.

For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then air pushed through a wet membrane (or a wet object moving through the air) can potentially reach the WB temperature, at least until the object is heated or dried.
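For readers who want to estimate the WB temperature themselves, Stull's (2011) empirical formula approximates it from dry bulb temperature and relative humidity (valid roughly for 5–99% RH and −20 to 50°C); a sketch:

```python
import math

def wet_bulb_stull(t_db_c, rh_pct):
    """Stull (2011) empirical wet-bulb approximation (inputs: degC, %RH)."""
    return (t_db_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_db_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# 20 degC at 50% RH gives roughly 13.7 degC WB, close to the 14 degC example above
print(round(wet_bulb_stull(20.0, 50.0), 1))
```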

Why is this usage so high?

Water usage is inversely related to the temperature of the water flowing to the data centre's internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature you could exploit naturally to reach that flow temperature is around 5°C/41°F.

How can you improve this usage?

The best possible way to reduce usage is to elevate the water temperature the data centre requires to cool its equipment efficiently and effectively. The rear door cooler is a great example of this: unlike traditional CRAC systems, which mix colder air with warm air to achieve an ambient temperature, it neutralises the air itself, so a higher water temperature can achieve the same result. The graphs below show the average high temperatures for DB and WB over a thirty-year period.

India 1 WUE Graph
India 2 WUE Graph

As you can see above, India provides a challenging environment for any cooling requirement, with high DB temperatures and correspondingly high WB temperatures.

The important factor here is that anything above the blue line can utilise the DB and therefore requires no additional water. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where water usage comes in. Anything beneath the orange line requires additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.

What happens when you implement a ColdLogik rear door?

India 3 WUE Graph
India 4 WUE Graph

In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in grey.

In India its clear to see that by utilising the traditional approach you would, on average, have a need for the adiabatic system for the whole year and you would also require mechanical for the whole year in varying load. However, as most chillers have a minimum run of 25% less free cooling may be used.

By utilising the ColdLogik door you would, on average, require no additional mechanical cooling on site for standard operation. Chillers with refrigeration circuits would normally remain on site to maintain redundancy in case of exceptional need, but would not be required on a regular basis. Water usage on the ColdLogik system would also be lower for 6 months of the year, most likely accounting for a drop of around 20% across that period.

Conclusion

In conclusion, considering the lower water usage across those 6 months, the ColdLogik door would likely save a minimum of 10% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the publicly available information for India, this could drop the current projected figure of 12.37 billion litres of water down to 11.13 billion litres, a 10% reduction. In the future, as the ASHRAE guidelines push further into the allowable limits, the amount of water that could be saved will only grow.



What happens when you implement a ColdLogik Rear Door? In the graphs next to this you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of the Chinese market, it’s clear that with the traditional approach you would, on average, need the adiabatic system for most of the year, plus mechanical assistance at varying load. Moreover, as most chillers have a minimum run of 25%, less free cooling may be available. By utilising the ColdLogik door you would, on average, need no additional water for adiabatic cooling for 6 months of the year, and mechanical cooling assistance for only around 1-2 months. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs; however, they may not need to run for 10 months of the year, delivering an additional operational saving.

China 3 WUE Graph


One of the major active equipment manufacturers has openly said that a realistic figure for water use can be 68,000 litres per MW per day. Unfortunately information is scarce, so a conservative figure of 1,000 MW of cooling capacity can be used across the country, which would give a usage of around 68 million litres of water per day.
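Multiplying out gives both the daily figure and the annual projection used in this section (a 365-day year is assumed):

```python
litres_per_mw_day = 68_000   # quoted manufacturer figure, litres per MW per day
capacity_mw = 1_000          # conservative assumed cooling capacity for China

daily = litres_per_mw_day * capacity_mw
annual = daily * 365

print(f"{daily / 1e6:.0f} million litres per day")      # 68
print(f"{annual / 1e9:.2f} billion litres per year")    # 24.82
```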


China 1 WUE Graph
China 2 WUE Graph

As you can see above, China provides a challenging environment for any cooling requirement, particularly in summer, with high DB temperatures and correspondingly high WB temperatures.


What happens when you implement a ColdLogik rear door?

China 3 WUE Graph
China 4 WUE Graph

In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in grey.

In China it’s clear that with the traditional approach you would, on average, need the adiabatic system for almost the whole year, and mechanical cooling, at varying load, for half of the year. Moreover, as most chillers have a minimum run of 25%, less of the free cooling may be available.

By utilising the ColdLogik door you would, on average, need no additional water for adiabatic cooling for 6 months of the year, and mechanical cooling assistance for only around 1-2 months. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside the average occurs; however, they may not need to run for 10 months of the year, delivering an additional operational saving.

Conclusion

In conclusion, even without counting the lower water usage across the remaining months, which could be substantial, the ColdLogik door would likely save a minimum of 25% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the conservative 1,000 MW figure, this could drop the current projected figure of 24.82 billion litres of water down to 18.6 billion litres, a 6.2 billion litre reduction. That is the equivalent of filling the Bird's Nest stadium in Beijing, the centrepiece of the 2008 Olympic Games, with water twice over.



Deployment Strategies​

This section covers USystems' recommendations for deploying ColdLogik RDCs (RDHx) into both legacy and new-build data centers.

Fits or retro fits into any Data Center​

  • ColdLogik RDCs are fully retrofittable onto any OEM rack
  • RDCs can use the return water from the existing perimeter room cooling and chiller system
  • RDCs can be top- and bottom-fed as standard
  • RDCs do not affect baying racks on either side
  • Full rear access to the rack is retained
  • Can be retrofitted to a Hot Aisle Containment System without changing the floor layout
  • RDCs are the only retrofittable solution capable of overcoming all the restrictive issues within legacy data centers
  • Increased floorspace for cabinet deployment
  • Flexibility on kW duty (0.1-200kW per cabinet)

External cooling options at a glance​

ColdLogik RDCs enable the use of transformative technology, unlocking the potential of free cooling and giving you the freedom to choose the right technology to suit the local climate, which is vitally important.

Below you will see a range of options detailing free cooling capability, water usage and PUE, making it easy to pick what is best for your next deployment depending on location.

Resource/Technology – in optimum order | Water usage | PUE*
Borehole / Sea / River / Lake | No | 1.035
Cooling Tower / Adiabatic Cooler | Yes | 1.06
Dry-Air Coolers only (NB: climate dependent) | No | 1.06
Hybrid Chiller | Yes | 1.15
ColdLogik Standard Chiller (20-30) | No | 1.18
Standard Chiller (14-20) (NB: industry standard) | No | 1.38
Standard Chiller (7-14) (NB: industry standard) | No | 1.43
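As a sketch, the options can be encoded so a deployment study shortlists them programmatically; the names and PUE values are transcribed from the table above, and the water flags follow its water-usage column:

```python
# (technology, uses_water, pue) transcribed from the options table
options = [
    ("Borehole / Sea / River / Lake", False, 1.035),
    ("Cooling Tower / Adiabatic Cooler", True, 1.06),
    ("Dry-Air Coolers only (climate dependent)", False, 1.06),
    ("Hybrid Chiller", True, 1.15),
    ("ColdLogik Standard Chiller (20-30)", False, 1.18),
    ("Standard Chiller (14-20)", False, 1.38),
    ("Standard Chiller (7-14)", False, 1.43),
]

def best_option(water_allowed=True):
    """Return the lowest-PUE option, optionally excluding water-consuming plant."""
    viable = [o for o in options if water_allowed or not o[1]]
    return min(viable, key=lambda o: o[2])

print(best_option(water_allowed=False)[0])  # Borehole / Sea / River / Lake
```

Real selection would of course also weigh the climate-dependency notes, site access to natural water sources, and redundancy requirements.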


Using a natural resource such as a borehole, sea, river or lake as the heat exchange point is the ultimate “Free Cooling” solution, and it has been achieved in a number of data centres using ColdLogik RDCs. This method also gives excellent PUEs and is the most efficient approach to data center cooling.

In the right global locations, and where usable water is available, Adiabatic Coolers can offer efficiencies greater than a dry-air cooler because of their method of heat rejection; a reduced footprint is feasible by comparison, and they can operate without the need for mechanical cooling. These options can offer significant environmental benefits to data centers working toward a more efficient and sustainable future.

The Adiabatic Cooler functions by exploiting the wet bulb temperature instead of the ambient temperature. Water is released to run over “pre-cooling pads” situated on the outside of the heat exchanger, so the air drawn through the wet pads, and then over the heat exchanger, is lower than the ambient temperature, providing greater heat rejection. The water/adiabatic system is only activated during periods when it is needed; the system is expected to operate as a dry-air cooler for most of the year and makes the biggest impact in hot, dry environments, potentially consuming 80% less water than traditional evaporative systems in the process.

The Cooling Tower (closed loop) provides similar benefits to the Adiabatic Cooler; they differ mainly in style and design. Closed-circuit cooling towers reject the fluid heat load into the ambient air via a heat exchanger, which isolates the process fluid from the outside air, keeping it clean and free of contamination in a closed loop.
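The pre-cooling effect can be estimated with the standard evaporative-media effectiveness relation, where air leaving the pad approaches the wet bulb temperature; the 85% pad effectiveness used here is a typical assumed figure, not a product specification:

```python
def pad_outlet_temp(t_db_c, t_wb_c, effectiveness=0.85):
    """Air-on temperature after an adiabatic pre-cooling pad.

    effectiveness is an assumed, typical media figure (0..1).
    """
    return t_db_c - effectiveness * (t_db_c - t_wb_c)

# On a 35 degC day with a 22 degC wet bulb, the heat exchanger
# sees roughly 24 degC air instead of 35 degC.
print(round(pad_outlet_temp(35.0, 22.0), 2))  # 23.95
```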

The Dry-Air Cooler (DAC) is a source of “Free Cooling”. Depending on the environment it is best used as the only external plant; however, it can also be used in conjunction with a chiller to reduce the load demand. The DAC is the most efficient option in cooler environments, or when water usage and reduced maintenance are a priority. The main power consumers in this system are the fans, which draw relatively cool ambient air over the heat exchanger to cool the fluid, making this an energy-efficient solution as well.

Where a DAC is used alongside a chiller, the control system switches off the chiller(s) when they are no longer needed, reducing operational costs and increasing the lifespan of the system. The resulting reduction in Operational Expenditure makes this an attractive solution for data centres.

Hybrid Chillers incorporate a free cooling system integrated into the chiller set. This can be done in two ways: either through free cooling on the refrigerant cycle, which is more traditional, or through a separate cooling circuit that rejects heat from the process fluid to air before it enters the chiller. This gives the potential to significantly reduce the energy used by the chiller sets while maintaining redundancy in critical situations or extreme climates.

Chillers are among the biggest consumers of power in a data centre, yet they remain widely used throughout the world for redundancy purposes. They operate on the refrigeration principle, employing compressors and refrigerant to create the cooling effect. Due to the variable cooling load demand and the limitations of components and refrigerants, these systems are often oversized, leading to further inefficiencies compared with alternative methods of rejecting heat from the white space.

Chillers do have their benefits in producing lower water flow and return temperatures, such as 7°C/14°C (44.6°F/57.2°F); however, it is not necessary to operate at these conditions to maintain ASHRAE A1 in the white space when using the ColdLogik solution. Elevating a chiller's operating temperatures from the industry-standard flow and return of 14°C/20°C (57.2°F/68°F) to the ColdLogik chiller standard of 20°C/30°C (68°F/86°F) reduces the physical size and power consumption of the chiller deployment and increases its energy efficiency.
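To illustrate why elevated water temperatures unlock more free cooling, a sketch that counts how many ambient readings permit chiller-free operation at each flow temperature (the sample readings and the 3 K approach are illustrative assumptions):

```python
def free_cooling_hours(hourly_db_c, flow_temp_c, approach_c=3.0):
    """Count readings where a dry-air cooler alone can make the flow temperature."""
    return sum(1 for t in hourly_db_c if t <= flow_temp_c - approach_c)

sample = [2, 6, 9, 12, 15, 18, 21, 24, 27, 30]  # illustrative ambient DB readings, degC

print(free_cooling_hours(sample, 14.0))  # 3 readings at the 14 degC industry standard
print(free_cooling_hours(sample, 20.0))  # 5 readings at the ColdLogik 20 degC standard
```

The higher the flow temperature, the larger the slice of the year the external plant can cover without running compressors.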


Low to high density cooling with one solution in one space

One of the best kept secrets is that the ColdLogik RDC isn’t just a high-density solution: thanks to the control philosophy, you can have high density and low density in the same space, or even next to each other, without any changes to the white space design. The next videos will walk you through some of the possibilities the ColdLogik RDC unlocks, enabling you to pick the right cooling solution for every deployment.


Deploying ColdLogik RDC’s in MTDC/Colocation Data Centers​

This video demonstrates the benefits of ColdLogik RDC’s deployed into MTDC/Colocation Data Centers. Topics covered include:​

  • Increased Cabinet kW duty
  • Space saving deployment
  • Operational Savings
  • Energy & Water Usage reduction

Deploying ColdLogik RDC’s in an existing Data Center​

This video demonstrates the benefits of ColdLogik RDC’s deployed into an existing Data Center​.


Shared Rear Door Cooling in a Data Center​

This video demonstrates the versatility of ColdLogik RDCs when shared between every two or three racks.

Pre-Installation Services

As part of the ColdLogik commitment, you get a full Data Center Health Check for the optimization of efficiency and sustainability. Our data center experts can deliver a program that will help you improve the overall efficiency and performance of your data center: by simply allowing our technical experts to evaluate your entire data centre's infrastructure, we can recommend the latest best practices and technologies.

As carbon reduction is high on everyone's agenda, one of the quickest ways to achieve targets is to ensure you are deploying the latest efficient technologies and design. This is relevant not just for new builds but also for retrofitting an aging data center facility.

As part of the ColdLogik Services Optimization program, the Data Center Health Check examines a data centre's environment and determines whether it is meeting the performance and efficiency goals of its operator.

For data center operators considering futureproofing their facility, while potentially introducing higher-density practices to meet the demands of AI for instance, our Health Check program can recommend power and cooling modifications that improve efficiency and save tens of thousands of dollars per year.

Addressing business challenges, the Health Check is designed to tackle common design flaws and misconceptions, delivering data and analytic reporting to overcome obstacles that can hinder performance, such as the increased energy consumption of legacy cooling architecture, which can be rectified to reduce power and water usage and deliver a more efficient, intelligent integrated building system.

As most businesses are committed to reducing carbon emissions by 2030 or 2050, existing IT facilities, better known as data centers, are a key starting point, since significant savings can be made there.

On the surface, a data center might not be as healthy as it looks. To assess its health, our technical engineers conduct an on-site visit to determine whether there are specific areas where power and cooling can be improved. We then make recommendations from this data, creating a migration path to ensure best practice and improve performance.

Our data center specialists, with many years of combined industry knowledge, will not only provide a detailed report covering existing issues but can also deliver everything from quick wins through to extensive redesign and deployment, including design, manufacture, installation, and even overall project management of the fit-out as principal contractor.

We are here to help you make informed decisions about your data center and its transition to a more efficient and sustainable facility, achieving the desired outcome while still ensuring outstanding carbon reduction.

Contact us
