#TheCloudShow – S2E4 – Cloud and the Edge

Edge computing can be defined as a “mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet.”

Now, there are some crystal-ball gazers saying that Cloud is dead and Edge computing is the future.

But where did Edge come from? A product called NATS started this trend, in which enterprise messaging systems and platform technologies could be decomposed into microservices that use the processing capacity of devices at the end user rather than in remote computing.

There is clear value in utilising the processing power of devices that are close to the end user. The key factor here lies in some basic physics – so if you did not pay attention in school when the speed of light came up, you might have to concentrate hard now.

Historically, the technology industry has swung back and forth between centralised computing and computing that happens entirely at the end user. This pendulum is now being disrupted by things like Machine Learning and Artificial Intelligence.

As humans, we understand linear processing – our basic instincts demand responses like fight or flight. Essentially we are 50,000-year-old wetware, and as a result we struggle to understand exponential growth.

Artificial Intelligence and Machine Learning are seeing exponential growth, but compute is ultimately limited by the speed of light – namely, how long data takes to travel up and down the internet.

This is where Edge computing comes in: it takes the work out of the far-away data centres that cloud traditionally runs on and brings it to devices closer to where the action is.

The argument is that software and compute will move closer to the consumer in order to meet their insatiable demand for instant access. The fastest responses will come from telco providers rolling out 5G and from the devices themselves processing the demand. Fundamentally, this is about what is known as latency, namely the time it takes for data to go up and down the pipe.
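To put rough numbers on that speed-of-light argument, here is a minimal back-of-the-envelope sketch. The distances and the fibre propagation speed (roughly two-thirds of c) are illustrative assumptions, not measurements, and it ignores routing, queueing and processing delays – it only shows the physics floor on round-trip time for a far-away cloud region versus a nearby edge site.

```python
# Back-of-the-envelope latency from distance alone, ignoring routing,
# queueing and processing delays. Distances are illustrative assumptions.

SPEED_OF_LIGHT_KM_S = 300_000                   # c in a vacuum, km/s
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # signal in optical fibre travels at roughly 2/3 c

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds for a given one-way distance."""
    return (2 * distance_km / FIBRE_SPEED_KM_S) * 1000

for label, km in [("remote cloud region", 1_000),
                  ("metro edge site", 50),
                  ("device on premises", 1)]:
    print(f"{label:>20}: {round_trip_ms(km):7.3f} ms round trip (physics floor)")
```

Even before any real-world network overhead is added, distance alone puts a floor of roughly 10 ms on a 1,000 km round trip, while a site a few kilometres away is effectively instantaneous – and that gap is what the Edge argument rests on.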

Or is it?
