When COVID-19 started wreaking havoc around the world and forced governments to impose lockdown measures in an attempt to slow its spread, daily life changed significantly for most people. Business operations of all kinds were disrupted: offices closed, non-essential brick-and-mortar stores shut down, and jobs were lost. Understandably, a lot of panic resulted.
On the whole, though, it’s remarkable how well organisations everywhere (private or public) have adapted to these difficult circumstances, and it’s mostly due to the convenience and ubiquity of the cloud. Without it, the sudden move to remote working would have been too much for many: if this pandemic had struck 15 years ago, the effects would have been very different.
After a short period in which remote working was standard practice and traffic had dropped enormously across the board, you probably saw the stories about pollution levels falling drastically and air quality improving in proportion. People started to focus on the positives of the lockdown: once the pandemic is eventually thwarted, surely we should stick with this new approach to business.
There is one major concern that the new business world has highlighted — and that’s the energy use of cloud technology. In this post, we’re going to consider how sustainable the cloud really is, and what can be done (or is already being done) to make things better:
The drain of early blockchain technology
Concerns about the energy use of technology really hit the mainstream when cryptocurrencies started to pick up steam and people everywhere invested heavily in crypto mining. High-power GPUs were set up to run at full capacity around the clock, all in the hope of turning a profit, and environmentalists were understandably frustrated. The proof-of-work calculations being generated had no value beyond finding and allocating coins: it was purely about making money.
Over time, things began to change. Blockchain itself matured, being embraced as a decentralisation framework by ambitious enterprises such as Everipedia, a more open-minded alternative to Wikipedia. At the same time, the profitability of mining declined, and manufacturers started disincentivising the use of consumer-grade hardware for it.
Computing solutions to environmental problems
Blockchain didn’t just become more energy efficient as it grew: it also started to be used to address environmental issues. Companies arose to use cryptocurrencies to encourage the trading of resources like additional electricity from solar panels. Simultaneously, cloud processing was brought to bear as a powerful tool for calculating energy solutions.
Machine learning deployed at massive scale can build models of everything from climate change to the potential efficacy of alternative energy systems, models that go far beyond what we could ever achieve without it. There's still a great deal we don't know for sure about these systems, which makes them extremely difficult to improve: any given change could have unforeseen consequences. The cloud, though, can lend us some of that missing foresight.
Yes, cloud computing requires an enormous amount of electricity to function, but if it ultimately devises new ways to reduce consumption elsewhere and produce more efficient systems, then it’s sure to constitute a net positive in the long run. It’s a sacrifice that must be made.
Huge data centers are actually more energy-efficient
Lastly, the biggest reason the sustainability of cloud computing isn't as big a problem as some think is that it's actually more efficient than conventional computing. Picture thousands of personal computers working away at their own calculations: their parts will vary widely in energy efficiency depending on age and quality, since each generation of processing technology is markedly more efficient than the last.
Now redistribute that work to one massive data center that features all the latest hardware and software innovations. Everything is built around optimisation. That center will use an incredible amount of power, certainly, but not compared to the net energy use of all the home computers that would handle the tasks otherwise.
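The consolidation argument above can be made concrete with a quick back-of-envelope calculation. The figures below are purely illustrative assumptions (desktop wattage, server wattage, machine counts, and a typical data-centre PUE), not measured data, but they show how aggregate energy use can fall even though each server draws far more power than any single desktop:

```python
# Back-of-envelope comparison: distributed desktops vs a consolidated
# data centre. All numbers here are illustrative assumptions.

def annual_kwh(watts: float, hours_per_day: float, days: float = 365) -> float:
    """Convert an average power draw into annual energy use in kWh."""
    return watts * hours_per_day * days / 1000

# Assumption: 1,000 office desktops averaging ~100 W across
# 8-hour workdays, roughly 250 working days a year.
desktops = 1000
desktop_kwh = desktops * annual_kwh(watts=100, hours_per_day=8, days=250)

# Assumption: the same workload consolidated onto 25 highly utilised
# servers at ~350 W each, running around the clock. PUE (power usage
# effectiveness) scales IT power up to total facility power.
servers = 25
pue = 1.2
server_kwh = servers * annual_kwh(watts=350, hours_per_day=24) * pue

print(f"Desktops:    {desktop_kwh:,.0f} kWh/year")
print(f"Data centre: {server_kwh:,.0f} kWh/year")
```

Under these assumptions the desktops come out at 200,000 kWh a year against roughly 92,000 kWh for the data centre, despite the facility overhead. The real-world numbers will differ, but the shape of the comparison is the point: high utilisation and modern hardware beat a fleet of idling machines.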
If every personal computer in the world could be replaced with something like a cloudbook that offloaded almost all of its processing to the cloud, energy use would fall substantially. You can contend that it would be better still if less processing were done overall, but there's no putting that genie back in the bottle: we'll all continue to use computers constantly regardless, and cloud computing is by far the lesser of two evils.
With so much work being done remotely, it’s understandable that people are getting concerned about the sustainability of cloud processing, but there really isn’t much reason to be worried. Using the current form of cloud computing is already vastly more efficient than relying on personal computers to get things done, and technology will continue to improve at a rapid clip. If we want a more energy-efficient world, we need to embrace the cloud, not shy away from it.