I never thought such huge savings would be possible from replacing a ten-year-old computer system, but here we are. Normally when I think of energy savings opportunities, I think of HVAC equipment replacements, variable speed drive installations, improvements to control sequences, lighting upgrades, and even electric vehicles. So, when a company told me we could achieve big savings by upgrading computer infrastructure, I was intrigued.

At the Princess Margaret Cancer Research Tower (PMCRT), Dr. Igor Jurisica’s research lab conducts cutting-edge experiments in computational biology with the goal of developing new cancer treatments. Please take a look at his website for more information, as I would probably do it a disservice trying to explain it here. The massive computational power required to run biological simulations is provided by a High Performance Computing (HPC) cluster with an estimated performance of 6.2 teraFLOPS and 2,688 gigabytes of total memory. According to benchmark testing data I found online, my work laptop operates at 12.92 gigaFLOPS, so Dr. Jurisica’s HPC system is approximately 500 times more powerful!
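The "500 times" figure is just the ratio of the two benchmark numbers, converted to the same units. A quick sketch of the arithmetic:

```python
# Rough performance comparison between the HPC system and a typical laptop.
# Both figures are taken from the article; the factor of ~500 is the
# rounded ratio once teraFLOPS are converted to gigaFLOPS.
hpc_gflops = 6.2 * 1000   # 6.2 teraFLOPS expressed in gigaFLOPS
laptop_gflops = 12.92     # benchmark figure for the author's work laptop

ratio = hpc_gflops / laptop_gflops
print(f"HPC system is ~{ratio:.0f}x faster")  # ~480x, i.e. roughly 500 times
```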

Original System

The original system consisted of a number of IBM System x3650 servers and a Voltaire switch, housed in racks about the same size as five refrigerators lined up side by side. The system generated a lot of heat and required a supplemental 30-ton cooling system. If the cooling system tripped, the room could reach temperatures of up to 60 degrees Celsius! The old system is pictured below.

To measure the electricity savings from this project, we metered the two electric panels that supply the HPC equipment as well as the electric supply to the cooling system. The meters were installed for a week both before and after the installation of the new equipment, and Dr. Jurisica’s staff ran simulations approximating typical usage patterns in order to produce an equivalent comparison that could be extrapolated over a year. The chart below shows the electric metering of the original system in June 2016.

Pre-retrofit Baseline Meter Data
Original HPC System Electric Metering

As you can see in the chart above, the total system (blue line) draws about 25 kW during low usage, about 60 kW during moderate usage, and 80-100 kW during peak usage, with a peak power consumption of 114 kW. Extrapolated over the course of a year, the system used about 520,000 kWh.
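The annual figure comes from scaling the metered week up to a full year. The sketch below shows the idea: the load levels are from the article, but the fraction of time spent at each level is an illustrative assumption I chose so the total lands near the reported 520,000 kWh, not the actual metered profile.

```python
# Illustrative extrapolation of one week of metered power data to a year.
# Load levels (kW) come from the article; the time fractions are assumed
# for illustration only -- the real estimate used the measured profile.
HOURS_PER_YEAR = 8760

# (average power in kW, assumed fraction of the year at that level)
load_profile = [
    (25, 0.30),  # low usage
    (60, 0.35),  # moderate usage
    (90, 0.35),  # peak usage (midpoint of the 80-100 kW band)
]

annual_kwh = sum(kw * frac * HOURS_PER_YEAR for kw, frac in load_profile)
print(f"Estimated annual consumption: {annual_kwh:,.0f} kWh")
```

With these assumed weights the average load works out to 60 kW and the annual total to about 525,600 kWh, in the same ballpark as the 520,000 kWh estimate.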

New System

The new system, which was designed to deliver similar computing performance to the original, consists of a number of IBM NeXtScale nx360 M5 compute nodes with supporting hardware and switches. The system was designed and implemented by Scalar Solutions. It is somewhat shocking to see how fast computer technology has advanced in 10 years’ time – the new configuration has shrunk from five refrigerators’ worth of racks to about the size of a filing cabinet. The new equipment produces so little heat that the old 30-ton cooling system was removed and replaced with a residential-sized 2-ton AC unit!

New HPC System

As with the baseline, three meters were connected to the two electric panels and the chiller for a week in May 2017 to measure the electricity consumption of the new system. The chart below shows the remarkable results.

Post-retrofit Meter Data
New HPC System Electric Metering

As you can see, the total system (blue line) uses about 4 kW during low usage, 5-6 kW during moderate usage, and about 8 kW during peak usage, with a peak power consumption of 8.5 kW (compared to 114 kW for the original). Extrapolated over the course of a year, the new system is expected to consume about 44,000 kWh (compared to 520,000 kWh for the original).


Thanks to dramatic improvements in the efficiency of computer technology, the savings from this project are extraordinary. While maintaining the existing computing performance, the retrofit reduced our peak demand by 92.5% and our annual electricity consumption by over 91%! These electricity savings translate into cost savings for UHN of over $50,000 per year, with a payback of under 3 years. The project also provides equipment renewal and a more reliable HPC system for Dr. Jurisica’s lab.
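The headline percentages follow directly from the before/after meter readings quoted above, which a few lines of arithmetic confirm:

```python
# Verifying the headline savings figures from the metered before/after data.
# All input numbers are taken directly from the article.
old_peak_kw, new_peak_kw = 114, 8.5
old_annual_kwh, new_annual_kwh = 520_000, 44_000

peak_reduction = (1 - new_peak_kw / old_peak_kw) * 100
energy_reduction = (1 - new_annual_kwh / old_annual_kwh) * 100

print(f"Peak demand reduction:       {peak_reduction:.1f}%")   # 92.5%
print(f"Annual consumption reduction: {energy_reduction:.1f}%") # 91.5%
```

The energy reduction rounds to 91.5%, consistent with the "over 91%" figure, and the 476,000 kWh saved each year is consistent with the quoted $50,000+ annual savings at a blended electricity rate of roughly $0.105/kWh (my inference, not a figure from the article).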

I would like to thank Dr. Jurisica and Christian Cumbaa for providing support during the monitoring and verification process as well as Ian McDermott and Carolyn McGinley for support from the UHN Research infrastructure team. Thanks also to Jana Jedlovska and Toronto Hydro for lending us the power meters that we used to conduct the electricity measurements.

Hopefully you found this case study interesting and it gets you thinking outside the box on energy savings opportunities at your organization!