Low-Hanging Green Fruit to Achieve Energy Savings in Data Centres

July 23, 2010


With the advent of the Carbon Reduction Commitment, and in line with our long-standing commitment to utilise the latest energy-saving products and services, Temperature Control has decided to partner with some key companies to look at energy savings in data centres.

Over the next few months Temperature Control will be announcing many new initiatives looking at how we can harness renewable energy, improve current set-ups and implement new technology to reduce power consumption in data centres.

To see how simple some of the changes can be, below is a 10-point action plan.

1. Basic maintenance

Basic maintenance of cooling systems is often overlooked in data centres. Blocked filters, dirty heat exchangers and loose fan belts can increase power consumption by up to 30%. On a standard CRAC system this could be up to £300 per month, or over £3,000 per year.

A standard yearly maintenance visit can cost as little as £500 per year; compared with the potential savings, it doesn’t take long to realise that this is essential.
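To see how quickly that maintenance pays for itself, here is a rough sketch using the figures above. The £300 per month of waste and the £500 contract cost come from this article; everything else is a simplifying assumption:

```python
# Rough payback estimate for annual CRAC maintenance,
# using the figures quoted in this article (illustrative, not measured).
wasted_per_month_gbp = 300    # extra running cost from neglected maintenance
maintenance_gbp = 500         # typical yearly maintenance contract

annual_waste_gbp = wasted_per_month_gbp * 12               # ~3,600 per year
net_annual_saving_gbp = annual_waste_gbp - maintenance_gbp
payback_months = maintenance_gbp / wasted_per_month_gbp

print(f"Net annual saving: ~£{net_annual_saving_gbp:,}")   # ~£3,100
print(f"Maintenance pays for itself in ~{payback_months:.1f} months")
```

On these numbers the contract recovers its cost in under two months.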

2. Monitor performance of room

Before any improvements are made, it is essential to monitor the current operation so that you have a baseline against which real improvements can be seen.

It is also important that any changes are monitored afterwards, to confirm they are delivering the improvements anticipated.
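If no monitoring is in place, even a simple logger polling a room sensor every few minutes gives you that baseline. Below is a minimal sketch, assuming a hypothetical sensor-reading function and log file path; replace both with whatever your sensor hardware or building management system actually provides:

```python
import csv
import time
from datetime import datetime

def read_room_temperature_c() -> float:
    """Hypothetical placeholder: swap in a real read from your
    sensor hardware or building management system."""
    return 21.0  # stub value so the sketch runs

LOG_PATH = "room_temps.csv"   # assumed log location
INTERVAL_S = 300              # one sample every five minutes

with open(LOG_PATH, "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        writer.writerow([datetime.now().isoformat(), read_room_temperature_c()])
        f.flush()             # make sure every sample reaches disk
        time.sleep(INTERVAL_S)
```

Plotting the log before and after each change makes it obvious whether the change delivered.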

3. Increase temperature settings

One simple change that can save energy is to increase the room temperature. We often find rooms set to 19 deg C, so they feel cooler than the comfort-cooled offices set at 22 deg C. Modern computers don’t need cooling to the temperatures of old, and many are happy in the mid 20s.

One word of caution though: consult both your IT manager and your cooling specialist before any modifications are made, as changes may affect control and alarm settings.

The temperature setting may also be used as a buffer in case the plant fails: the lower the room temperature, the longer the cooling can be down before critical temperatures are reached. If you rely on this buffer, it may be more practical to install standby cooling instead.

4. Tidy under the floor

The data centre is built and all the cables are neatly installed in containment, so that the cool air from the CRAC units distributes evenly to the floor grilles.

Move forward five years. The original configuration has changed beyond recognition: the structured cabling has tripled in size, and because the data centre has had to stay live throughout these changes, the cables and additional racks have been installed in whatever space was available.

How does this affect the cooling? If the air can’t get to those hot spots, the CRAC units work harder to cool the space. By having a spring clean, removing redundant cabling and tidying up the remaining cables, you can help the cold air reach the right places and reduce how hard your CRAC units work.

5. Re-balance the floor grilles

One other major contributor to getting the right air temperatures is the floor grilles. After years of moving them around, or shutting them off because of a cold draught, it is time to look at how the floor grilles are used.

Are they in the right place? Are you adopting a hot and cold aisle configuration, or just letting the cold air go into the room wherever it happens to fall?

Take this opportunity to plan how best to use your rack configuration and, if possible, rearrange your servers to achieve a hot and cold aisle regime. Once this is done, the floor grilles can be moved to suit; when they are re-balanced so that the same volume of air is blown through each grille, the CRAC units will run more efficiently.

6. Fill in the space around the servers

Once you have your hot and cold aisles, any warm air being drawn into your servers is not only inefficient, it may also harm the servers. Simple blanking plates in the gaps between servers ensure that only cold air is drawn in. This reduces operating temperatures, which in turn allows the supply air temperature to be raised, saving energy.

7. Fill in the gaps in the floor

The more cold air we can get into the cold aisles, the more efficient the CRAC units will be. One of the biggest problems we face here is the gap at the bottom of each rack. Many don’t see this as a problem because “the cold air goes into the rack anyway”. Good air management ensures that the air is thrown out of the floor grille to a height that matches the rack, giving an even distribution to servers at every level. Air introduced at the bottom of the rack only starves the servers at the top of cold air.

By filling the holes in the floor we not only improve the efficiency of the cooling plant, we also help extend the life of the servers at the top of the racks.

8. Hot or cold aisle containment

Taking hot and cold aisles to the next level, cold aisle containment is a proven solution for reducing energy use.

Usually confined to new data centres, this is no longer out of reach for existing sites: we have partnered with eCool Solutions, who offer installations on both new and existing sites. Working in partnership, Temperature Control can advise on and make changes to the current cooling systems to allow the cold aisle technology to be introduced.

Will this give any savings? Just ask Yahoo UK, who recently saved £35,000 after eCool Solutions implemented their cold aisle solution at their London headquarters.

9. Replace old cooling equipment

Like any technology, cooling equipment is constantly improving, and modern systems can be over 50% more efficient than older ones. Over the past 3 to 5 years energy use has become the driving factor when installing new cooling plant; systems are no longer designed on cooling output alone.

Modern data centres use free cooling with outside air, while also looking at ways of putting the waste heat to other uses.

Where possible this can be adapted to existing data centres, and now is the time to review what equipment you have and whether it needs upgrading. The decision may also be forced on you by current legislation: the phase-out of the R22 refrigerant commonly found in data centres throughout the world.
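To put the efficiency claim into perspective, compare the electrical draw of an old and a new unit delivering the same cooling. The EER figures (cooling output per unit of electrical input) below are illustrative assumptions, not quoted ratings:

```python
# Illustrative only: the EER values are assumptions, not quoted ratings.
cooling_load_kw = 100   # heat the room needs removing
old_eer = 2.5           # assumed efficiency of an ageing CRAC unit
new_eer = 4.0           # assumed efficiency of a modern replacement (60% better)

old_draw_kw = cooling_load_kw / old_eer   # 40 kW of electricity
new_draw_kw = cooling_load_kw / new_eer   # 25 kW of electricity

saving_kwh_per_year = (old_draw_kw - new_draw_kw) * 24 * 365
print(f"~{saving_kwh_per_year:,.0f} kWh saved per year running 24/7")  # ~131,400
```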

10. Replace old servers

Leading the way in constant improvement is the IT industry. That new server was superseded at the same time as you signed off the order; the result is that newer IT equipment is ever more powerful and can do more than the equivalent item it replaces.

Although there is an argument for asking why you need all this power when what you have still works fine, the advantage of changing to new equipment is that you use less power to do the same job.

The difference is not that noticeable between equipment six months apart in age, but once equipment is over three years old, it is. Most servers run 24 hours a day, 7 days a week, 52 weeks a year. There is no rest, and as a result they are at the end of their life after 3 to 4 years.

Is it time to change? How much could you save, and would that help contribute to the cost of the upgrade?
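As a back-of-the-envelope check, compare the running cost of an old server with a newer equivalent; every figure below is an illustrative assumption:

```python
# Back-of-the-envelope running-cost comparison; all figures are assumptions.
OLD_SERVER_W = 400          # assumed draw of a 3-4 year old server
NEW_SERVER_W = 250          # assumed draw of a modern equivalent
TARIFF_GBP_PER_KWH = 0.10   # assumed electricity price

HOURS_PER_YEAR = 24 * 365   # servers run 24/7 with no rest

def annual_cost_gbp(watts: float) -> float:
    return watts / 1000 * HOURS_PER_YEAR * TARIFF_GBP_PER_KWH

saving = annual_cost_gbp(OLD_SERVER_W) - annual_cost_gbp(NEW_SERVER_W)
print(f"~£{saving:.0f} saved per server per year")   # ~£131 here
# Cooling the extra heat from the old server costs more again,
# so the real saving is higher than this figure.
```

Multiplied across a room full of racks, figures like these go a long way towards funding the upgrade.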

Do you want more information on this? For many of the above points, the cost of implementing the changes is minimal compared to the savings that may be achieved. Not only that, a fit and healthy data centre will suffer fewer failures and less downtime; when those savings are considered too, it makes sense to act now.

Temperature Control will soon be offering a health check service to assess the current state of your data centre and advise on how these changes can benefit you – to register your interest please see www.temperaturecontrol.squarespace.co.uk
