
Beyond renewables: Emerging technologies for “greening” the data centre

16 Mar 2019

Article by Park Place Technologies president and CEO Chris Adams

Facebook recently made news with the opening of its data centre in Los Lunas, New Mexico, USA, which will operate exclusively on renewable energy. This project joins ventures by companies like Italy’s Aruba.it and Australia’s DC Two in prioritising alternative energy.

As important as renewables are to long-term sustainability, there are additional environmental considerations. From water use to raw material inputs, data centres are huge consumers of natural resources. They will need to realise greater efficiencies to minimise their impacts as global data volumes and the demand for intensive processing skyrocket.

Fortunately, there are several emerging technologies that can help slim the industry’s ecological footprint in the coming decades.

#1 Liquid hardware cooling

In the immediate future, liquid cooling could be among the most significant contributors to overall energy savings. Various liquid-cooled hardware is already in production. Lenovo, for instance, announced rear-door heat exchangers capable of cutting power usage effectiveness (PUE) from the 1.5 to 2 range to about 1.3. And systems using direct-to-chip cooling could bring PUE down to 1.1.
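PUE is the ratio of total facility energy to the energy consumed by IT equipment alone, so every point above 1.0 represents cooling and power-delivery overhead. A minimal sketch of the arithmetic behind the figures above (the function names and sample numbers are illustrative, not from any vendor tool):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def overhead_reduction(pue_before: float, pue_after: float) -> float:
    """Fraction of non-IT overhead (cooling, power delivery) eliminated.

    Overhead per unit of IT load is (PUE - 1), so improving from 1.5 to 1.3
    removes (0.5 - 0.3) / 0.5 = 40% of the overhead energy.
    """
    return ((pue_before - 1) - (pue_after - 1)) / (pue_before - 1)

# A facility drawing 1,500 kW total to power 1,000 kW of IT load:
print(pue(1500, 1000))                   # 1.5
print(overhead_reduction(1.5, 1.3))      # 0.4
```

The key point is that a seemingly small PUE improvement translates into a large cut in the overhead portion of the bill, which is where liquid cooling earns its keep.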

The time is also ripe for full immersion cooling, dunking sealed components in dielectric fluid. This approach eliminates server fans, reduces CPU power consumption, and takes the pressure off data centre air conditioning. Estimates for power savings reach as high as 50 percent.

Immersion cooling can also achieve roughly three times the hardware density of traditional air-cooled racks, which means delivering more computing power on less land. A final benefit is that immersion cooling reduces the effects of heat and humidity, helping to extend component lifespans. This means fewer drive replacements and lower overall material inputs.

#2 AI-driven data centre infrastructure management solutions

Artificial intelligence has a role to play as well, and Google is a leader in this field. The company recently upgraded from a 2014-era machine learning-based “recommendation engine” requiring manual infrastructure adjustments, to a proprietary, automated infrastructure management solution. The system evaluates over 120 variables, from fan settings to the dew point, to identify inefficiencies and optimise PUE. The technology is shaving 30 percent off the company’s cooling system energy use, and savings could reach 40 percent.
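Google’s system is proprietary, but the underlying idea is straightforward: model PUE as a function of controllable settings and ambient conditions, then search for the settings that minimise it. A toy sketch of that loop, with entirely invented coefficients and variable names:

```python
# Toy illustration only -- NOT Google's system. A hypothetical PUE model
# is searched over candidate fan speeds and cooling setpoints.
def predicted_pue(fan_speed_pct: float, setpoint_c: float, dew_point_c: float) -> float:
    """Invented model: faster fans add overhead; a colder setpoint means
    more chiller work; a high dew point adds dehumidification load."""
    fan_overhead = 0.004 * fan_speed_pct
    chiller_overhead = 0.03 * max(0.0, 27.0 - setpoint_c)
    humidity_penalty = 0.01 * max(0.0, dew_point_c - 15.0)
    return 1.05 + fan_overhead + chiller_overhead + humidity_penalty

def best_settings(dew_point_c: float):
    """Grid-search candidate settings for the lowest predicted PUE."""
    candidates = [(fan, sp) for fan in (40, 60, 80) for sp in (22, 24, 26)]
    fan, sp = min(candidates, key=lambda c: predicted_pue(c[0], c[1], dew_point_c))
    return fan, sp, predicted_pue(fan, sp, dew_point_c)

fan, setpoint, pue_estimate = best_settings(dew_point_c=12.0)
print(fan, setpoint, round(pue_estimate, 3))   # 40 26 1.24
```

The real system replaces the hand-written model with deep learning trained on the facility’s sensor history, and the grid search with a far larger optimisation over those 120-plus variables, but the control loop is the same shape.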

#3 Increased server utilisation

Speaking of AI, it has a lot to offer for server utilisation. Data centre operators recognise that zombie servers are a problem – one that’s not going away with virtualisation alone. An estimated 25 percent of physical servers are comatose, and roughly 30 percent of virtual servers have seen no activity in at least six months.

Arriving to remedy this problem are AI-based load balancers that can detect ghost servers and distribute workloads across available hardware based on server performance, disk utilisation, network congestion, and the energy efficiency of each piece of equipment.
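A minimal sketch of the two ideas in play – flagging comatose machines and scoring placement targets by efficiency – using invented field names and weightings purely for illustration:

```python
# Illustrative sketch, not any vendor's load balancer.
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    cpu_util_pct: float        # recent average CPU utilisation
    days_since_activity: int   # days since last meaningful workload
    watts_per_unit_work: float # energy efficiency metric (lower is better)

def find_zombies(servers, idle_days=180):
    """Flag 'comatose' machines: no activity for roughly six months."""
    return [s for s in servers if s.days_since_activity >= idle_days]

def placement_score(s: Server) -> float:
    """Lower is better: prefer efficient, lightly loaded machines.
    The weighting is invented for illustration."""
    return s.watts_per_unit_work * (1.0 + s.cpu_util_pct / 100.0)

def pick_target(servers):
    """Choose where to place the next workload, excluding zombies."""
    zombies = {s.name for s in find_zombies(servers)}
    live = [s for s in servers if s.name not in zombies]
    return min(live, key=placement_score)

fleet = [
    Server("a", cpu_util_pct=30, days_since_activity=1, watts_per_unit_work=2.0),
    Server("b", cpu_util_pct=5, days_since_activity=400, watts_per_unit_work=1.5),
    Server("c", cpu_util_pct=10, days_since_activity=2, watts_per_unit_work=1.8),
]
print([s.name for s in find_zombies(fleet)])  # ['b']
print(pick_target(fleet).name)                # 'c'
```

Real systems fold in disk utilisation and network congestion as further score terms and learn the weightings from telemetry rather than hard-coding them, but the decision structure is the same.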

IDC predicts that 50 percent of IT assets will have autonomous operation capabilities by 2022. Turning server utilisation over to the machines will help overcome human inertia, which often stalls manual workload retooling when the expected efficiency gains are only incremental.

#4 Quantum computing

Further into the future is the promise of quantum computing. A study by Oak Ridge National Laboratory, a U.S. Department of Energy facility, found that quantum computers could reduce energy needs by more than 20 orders of magnitude over their conventional counterparts.

The prospect has a lot of money flowing in. The government of British Columbia, Canada, for example, has spent millions with D-Wave, backing its promises to address climate change. Venture capital investment has reached $250 million per year, and some governments—the EU, China, and the USA—are pumping billions into research and development.

Some experts worry that the current status of quantum computing is overhyped. It’s true that today’s quantum computers remain prone to errors that are difficult to correct. The need for an entirely different programming approach is another core barrier.

Nonetheless, IBM has unveiled a commercial quantum computer, the Q System One. IBM is also among the companies making their quantum computers and accompanying programming kits publicly available online.

Unfortunately, these systems are of little practical value yet, and quantum computers won’t simply replace laptops anytime soon. For near-term applications, quantum computers will most likely serve as “accelerators” within major providers’ clouds. The goal would be to identify when workloads can benefit from a quantum system’s computational fortes and tap those resources on a targeted basis.

The true breakout of quantum computing is likely more than a decade off, but we still look forward to its potential once the field has matured.

#5 Increasing dominance of hyperscale providers

Hyperscale providers are best positioned to make the leap to these types of efficiency technologies, and because even marginal percentage savings translate into large financial sums at their scale, they have a strong incentive to do so.

It’s hard to imagine an organisation with a smaller budget than Microsoft’s, for example, exploring how to site data centres under the ocean to benefit from the naturally cool temperatures. Similarly, Google’s AI-based DCIM depends on deep learning trained on data from a particular site, so an equivalent commercial system capable of rolling out to more diverse data centres is still many steps away.

There are certainly reasons to worry about market consolidation, but as an increasing share of data storage, processing and network traffic moves to hyperscale providers—with their share expected to reach 57%, 69% and 53% respectively by 2020—the world is shifting toward those companies that can best leverage emerging technologies to enhance sustainability.
