#TheCloudShow – S2E4 – Cloud and the Edge

Edge computing can be defined as a “mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet.”

Now there are some crystal ball gazers who are saying that Cloud is dead and Edge computing is the future.

But where did Edge come from? A product called NATS started this trend, where enterprise messaging systems and platform technologies could be decomposed into microservices that use the processing capacity of devices at the end user rather than relying on remote compute.

There is clear value in utilising the processing power of devices that are close to the end user. The key factor here lies in some basic physics – so if you did not pay attention in school when the speed of light came up, you might have to concentrate hard now.

Historically, the technology industry has swung back and forth between centralised computing and computing that happens at the end user. This pattern is now being disrupted by things like Machine Learning and Artificial Intelligence.

As humans, we understand linear processing – our basic instincts demand responses like fight or flight. Essentially we are 50,000-year-old wetware, and as a result we struggle to understand exponential growth.

Artificial Intelligence and Machine Learning are seeing exponential growth – and what is happening now is that basic compute is limited by the speed of light, namely how long it takes for data to travel up and down the internet.
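
To put a rough number on that limit, here is a back-of-envelope sketch in Python. The "two thirds of c in fibre" figure and the distances used are illustrative assumptions, not measurements; real round trips are slower still once routing, queuing and processing are added.

```python
# Back-of-envelope: propagation delay is bounded by the speed of light.
# Light in optical fibre travels at roughly two thirds of c (an assumption
# based on the refractive index of glass); real round trips are slower
# once routing, queuing and processing are added.

SPEED_OF_LIGHT_KM_S = 300_000                   # c in vacuum, km/s
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~200,000 km/s in fibre


def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over fibre."""
    return (2 * distance_km / FIBRE_SPEED_KM_S) * 1000


# Illustrative distances (assumptions, not measurements):
for label, km in [
    ("Device to a nearby edge node (50 km)", 50),
    ("Device to an in-country cloud region (500 km)", 500),
    ("Device to a cross-continental region (8,000 km)", 8_000),
]:
    print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip, at best")
```

Even in this best case, the cross-continental trip costs tens of milliseconds before any processing happens at all, which is the physics behind the push to put compute closer to the user.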

This is where Edge computing comes in: it takes the compute out of the faraway data centres that cloud traditionally runs on and moves it to devices closer to where the action is.

The argument is that software and compute will move closer to the consumer in order to meet their insatiable demand for instant access. The fastest responses will come from telcos providing 5G and from the devices themselves processing the demand. Fundamentally, this is about what is known as latency, namely the time it takes for data to go up and down the pipe.

Or is it?
