By Luke Wheeler

Any business that owns or operates within a data centre will know that power consumption has become a major – if not the key – factor determining the cost of co-locating your computing infrastructure within a purpose-built facility. Electricity costs in the UK have nigh on doubled over the last seven years (source: Castle Cover utilities Index), and green initiatives from central government, using both carrot and stick, are applying increasingly strong pressure.

So with electricity costs escalating and the pressure to be green increasing, it pays dividends to ensure that you’re not wasting power. And with your computing infrastructure consuming a whole load of power – this is where PUE comes in.

PUE stands for Power Usage Effectiveness. It’s a metric used to determine the energy efficiency of a particular data centre, calculated by dividing the total amount of power entering the data centre by the power actually being used to run the computing infrastructure within it. To place this in an example:

If the total power entering the data centre is 375kW and the power actually being used by your computing infrastructure is 250kW, then your PUE value is 1.5. It may be easiest to think of it as a ratio (1:1.5), where “1” is your computing infrastructure’s power consumption and 1.5 is the power consumption of the data centre as a whole. In this example, the data centre is using 125kW (50% of your computing infrastructure’s power consumption) to deliver the facilities (i.e. the UPS, cooling, lighting etc.). The more efficient the data centre, the lower the relative power overhead of providing those facilities will be, and the lower the PUE value (assuming there’s no reduction in service levels). Currently the most efficient data centres are reporting PUE values of around 1.07, with the least efficient languishing around 3.0 and an overall industry average of 1.8–1.89 (according to The Uptime Institute, 2012).
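The calculation above is simple enough to sketch in a few lines of Python, using the figures from the worked example (the function name is ours):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT load power."""
    return total_facility_kw / it_load_kw

# The worked example: 375kW entering the facility, 250kW reaching the racks.
print(pue(375, 250))  # 1.5
```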

What this means to you and your business

As data centre costs have moved away from being governed by square footage towards power consumption, a substantial proportion of the cost of running your data centre operation will be directly attributable to the power you use. Whether you have a whole building, a floor of a third-party data centre, or just a rack in a hosting facility, the price you pay for your power will be directly impacted by the PUE of the data centre as a whole. It matters not whether you are on an ‘all-inclusive’ tariff or a metered power tariff – the data centre will be recovering what it costs them to power the facility, including your proportion of it. Put simply, the lower the PUE, the lower the price you’ll end up paying for power.
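One way to see this pass-through is as an effective price per kWh of IT load: if the operator recovers its total power bill proportionally, every kWh your equipment draws costs you roughly the grid tariff multiplied by the PUE. A minimal sketch, assuming proportional cost recovery and a hypothetical 12p/kWh grid tariff (neither figure is from the article):

```python
def effective_rate_per_it_kwh(grid_tariff: float, pue: float) -> float:
    # Each kWh of IT load obliges the facility to buy `pue` kWh from the
    # grid (IT load plus cooling, UPS losses, lighting etc.), and that
    # overhead is ultimately recovered from customers.
    return grid_tariff * pue

tariff = 0.12  # hypothetical 12p per kWh
print(effective_rate_per_it_kwh(tariff, 1.25))  # roughly 15p per IT kWh
print(effective_rate_per_it_kwh(tariff, 3.0))   # roughly 36p per IT kWh
```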

So how much money can you potentially save by co-locating in a highly efficient – low PUE – data centre?

As an illustration, you could save £41,000 per annum on a typical 5-rack deployment by choosing a data centre with a PUE of 1.25 over a data centre with a PUE of 3.0.
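The £41,000 figure can be recovered under one plausible set of assumptions (ours, not the article's): 5 racks drawing 4kW each, i.e. a 20kW IT load, running 24/7 at a grid price of 13.5p/kWh:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_power_cost(it_load_kw: float, pue: float, price_per_kwh: float) -> float:
    # Total energy billed = IT load scaled up by the PUE, over a full year.
    return it_load_kw * pue * HOURS_PER_YEAR * price_per_kwh

it_load_kw = 5 * 4.0  # assumed: 5 racks at 4kW per rack
price = 0.135         # assumed: 13.5p per kWh

saving = (annual_power_cost(it_load_kw, 3.0, price)
          - annual_power_cost(it_load_kw, 1.25, price))
print(f"£{saving:,.0f} per annum")  # in the region of the £41,000 quoted
```

Different rack densities or tariffs shift the absolute number, but the structure of the saving is the same: the facility overhead you pay for scales linearly with the PUE.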

Of course in practice your non-power-associated costs (for space) will probably cost more in an ultra-efficient data centre than they would in a low efficiency data centre, but the total cost of ownership (TCO) is still probably a lot less – and your green credentials will look a lot healthier too.

Can the PUE be affected by other customers?

When you’re in a shared facility, the activity or inactivity of other customers can affect the PUE of the data centre as a whole. If, for instance, you’re in a relatively empty facility, a fair amount of the total power may be cooling empty space, making it less efficient overall. If you’re in a busy facility, the data centre is likely to be working more efficiently, despite both the computing infrastructure power and the overall facility power draw being higher. As you might expect, the PUE in these scenarios boils down to how well the facility is managed, and most data centre operators will typically factor these variations in so that you have a consistent PUE to work with.

So is PUE the ultimate measure of a data centre?

If you’ve followed our PUE explanation, you’ll see that the best PUE opportunity exists where the power required to support the data centre facilities is the smallest fraction of the power that goes into the computing infrastructure. So how can facility power be kept to a minimum? Well, if you didn’t need cooling at all, that would be a huge advantage! This goes some way towards explaining why there’s been an increase in data centre builds in year-round colder climates – locations such as Iceland, Greenland and the Nordics. Achieving the same PUE in equatorial and tropical regions would be a challenge indeed. Whilst co-locating in cool climates may give you the best PUE and greenest credentials – and possibly highly competitively priced renewable energy – you still have to factor in your other data centre requirements. These are likely to include potentially mission-critical factors such as connectivity, accessibility, physical security, access to technical support, data location, and contractual flexibility.

To sum up then, PUE is a measure of the efficiency of the data centre as a whole, and it will directly or indirectly impact the operating costs of your co-location. It’s not the ‘be all and end all’, as your data centre operation is likely to need to fulfil a broader brief of requirements. A better PUE value should help you achieve better value from your data centre power provision – and also give you a better understanding of, and the ability to model, your exposure to inevitable future power price increases. Of course, to complete the picture, you should also examine how power-efficient the hardware within your racks is.
