Superconductivity is the property whereby a material exhibits zero electrical resistance, meaning current can flow through it without dissipating any energy as heat. It has the potential to bring revolutionary advances to high-end computing, delivering processing speeds far in excess of what is currently possible. And yet it is not a newly discovered phenomenon: it was first observed in 1911 by Dutch physicist Heike Kamerlingh Onnes, but the world still waits for superconductors to achieve mainstream use.


The problem lies in the fact that getting materials to exhibit superconductivity is easier said than done. It has only been observed in approximately 25 elements, and predominantly at extremely low temperatures, often very close to absolute zero (−273.15°C). The energy required to cool materials to this extent would negate any efficiencies gained from having zero electrical resistance. As such, achieving superconductivity at higher temperatures continues to generate much scientific research.

There have been some breakthroughs: scientists have now been able to create superconductors at temperatures of −70°C. That is still impractically cold for most applications, but it is a sign that, theoretically at least, there is no reason why room-temperature superconductors can't be developed in the future.


With temperature thresholds still extremely low, applications for superconductors are somewhat limited. Superconducting magnets are used in CERN’s Large Hadron Collider to accelerate sub-atomic particles, while some large electrical providers have experimented with superconducting materials to improve efficiency. However, it is the future applications of superconductors that really promise widespread benefits.


In particular, superconductors offer a potential solution to the inevitable problems facing the data centre industry. The growth of cloud computing, Big Data and related modern phenomena is placing greater pressure on data centres in terms of their processing capacity and reliability, even before considering the vast amount of energy that they consume. In the US, this is expected to reach 200 TWh by 2020, representing approximately five per cent of the country's total electricity consumption. The limitations of the semiconductor technology that data centres are built upon, which lead to wasted energy and make it ever harder to keep pace with Moore's Law, are likely to place significant pressures on the industry in the years to come.
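As a rough sanity check on that figure, 200 TWh does work out to roughly five per cent of US electricity use, assuming an annual national total in the region of 4,000 TWh (a round number used here for illustration, not one quoted in the article):

```python
# Back-of-the-envelope check on the data centre consumption figure quoted above.
# The ~4,000 TWh US annual total is an assumed round number, not from the article.
projected_datacentre_twh = 200    # projected US data centre consumption by 2020
assumed_us_total_twh = 4_000      # approximate total US electricity consumption

share = projected_datacentre_twh / assumed_us_total_twh
print(f"Data centre share of US electricity: {share:.0%}")  # ~5%
```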


Superconductors offer a potential solution to these data centre challenges, even if they remain theoretical for the time being. Scientists at the National Institute of Standards and Technology (NIST) are experimenting with the use of two superconducting electrodes to create the 0 or 1 binary values required for superconducting digital computer memory. If they are successful and can scale the technology, then data centres stand to benefit from a much higher level of efficiency. It could, in the words of NIST's Ron Goldfarb, "revolutionise mainframe computation and data storage within a decade."

[Image: superconductor. Image by Julien Bobroff, Frederic Bouquet, Jeffrey Quilliam, via Wikimedia Commons]

If computer memory can be sustained at the low temperatures required for superconductivity, it is not only commercial data centres that stand to benefit. Superconductor research currently taking place could also lead to the next generation of supercomputers.

The US government has plans in place to create what would be the world's fastest computer, one capable of making a quintillion calculations every second (also known as an exaflop), but, just as with data centres, the limitations of semiconductor technology are beginning to prove prohibitive. Superconductors offer a way of powering these huge computing resources with far less wasted energy. For example, existing supercomputers consume approximately 10 megawatts of power to deliver 20 petaflops of computation. By contrast, superconducting computers promise 100 petaflops of performance for just 200 kilowatts of energy.
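Putting those two quoted figures side by side, and treating petaflops per megawatt as the efficiency metric (a sketch only, not a benchmark used in the article), they imply an efficiency gain of roughly 250 times:

```python
# Back-of-the-envelope comparison of the two power/performance figures quoted above.
conventional_pflops, conventional_mw = 20, 10           # existing supercomputers
superconducting_pflops, superconducting_mw = 100, 0.2   # 200 kW expressed in MW

conventional_eff = conventional_pflops / conventional_mw            # 2 PFLOPS per MW
superconducting_eff = superconducting_pflops / superconducting_mw   # 500 PFLOPS per MW

print(f"Efficiency gain: {superconducting_eff / conventional_eff:.0f}x")  # ~250x
```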

