#TheCloudShow – S2E4 – Cloud and the Edge

Edge computing can be defined as a “mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet.”

Now there are some crystal ball gazers who are saying that Cloud is dead and Edge computing is the future.

But where did Edge come from? A product called NATS started this trend, showing how enterprise messaging systems and platform technologies could be decomposed into microservices that use the processing capacity of devices at the end user rather than in remote computing.

There is clear value in utilising the processing power of devices that are close to the end user. The key factor here lies in some basic physics – so if you did not pay attention in school when the speed of light came up, you might have to concentrate hard now.

Historically, the technology industry has swung back and forth between centralised computing and computing that happens at the end user. That pendulum is now being disrupted by things like Machine Learning and Artificial Intelligence.

As humans, we understand linear processing – our basic instincts demand responses like fight or flight. Essentially we are 50,000-year-old wetware – and as a result we struggle to understand exponential growth.

Artificial Intelligence and Machine Learning are seeing exponential growth – but the compute behind them is limited by the speed of light, namely how long it takes data to travel up and down the internet.

This is where Edge computing comes in: it shrinks that distance by taking workloads out of the far-away data centres that cloud traditionally runs on and moving them to devices closer to where the action is.

The argument is that software and compute will move closer to the consumer in order to meet their insatiable demand for instant access. The fastest response will lie with telco providers delivering 5G and with the devices themselves processing the demand. Fundamentally, this is about what is known as latency, namely the time it takes for data to go up and down the pipe.
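The physics is easy to sketch. The distances and speed below are illustrative assumptions (light in optical fibre travels at roughly two-thirds of its vacuum speed, about 200,000 km/s), and real networks add routing and queueing delays on top of this floor – but even the theoretical minimum shows why distance matters:

```python
# Illustrative sketch: the round-trip latency floor imposed by physics alone.
# Assumption: signals in optical fibre travel at ~200,000 km/s (about 2/3 of
# the vacuum speed of light). Real-world latency is always higher than this.

SPEED_IN_FIBRE_KM_PER_S = 200_000

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time in milliseconds for a given distance."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_S * 1000

# Hypothetical distances for comparison:
for label, km in [("same-city edge node", 50),
                  ("regional cloud data centre", 1_500),
                  ("intercontinental data centre", 10_000)]:
    print(f"{label:30s} {min_round_trip_ms(km):6.1f} ms minimum RTT")
```

Going intercontinental costs around 100 ms before a single packet is routed or processed – which is exactly the gap Edge computing promises to close.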

Or is it?
