Edge computing and its big brothers in the hyperscale cloud are often painted as locked in some sort of dog-fight across the datasphere. The reality is that each serves a different purpose. Cloud computing provides efficient, centralised storage at scale, processing data that is not so time-sensitive. It is constantly expanding. A report from MarketsandMarkets estimates the global cloud computing market will grow from US$545.8 billion in 2022 to US$1,240.9 billion by 2027 – a compound annual growth rate of 17.9%.
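As a quick sanity check on the report's headline figure, the implied growth rate can be recomputed from the two market sizes cited above (assuming a five-year 2022–2027 horizon):

```python
# Recompute the compound annual growth rate (CAGR) implied by the
# MarketsandMarkets figures cited above: US$545.8bn (2022) -> US$1,240.9bn (2027).
start, end, years = 545.8, 1240.9, 5  # US$ billions over a 5-year span

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~17.9%, matching the report
```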

For a decade and a half now, centralised cloud computing has been considered the standard IT delivery platform, and little is set to change on that front. However, new applications and workloads require an architecture built to support a distributed infrastructure – which is where edge computing is taking off.

Emerging problems with latency (time lag) and limited bandwidth when moving data have exposed the cloud’s shortcomings in certain situations. Edge computing is the infrastructure market’s response, processing time-sensitive data close to where it is generated to keep latency to a minimum. It is also preferred over cloud computing in remote locations, where there is limited or no connectivity to a centralised location.

The factors behind edge’s growth are only going to become more powerful. With the increase in remote working, edge data centres provide the important, reliable ‘last mile’ of connectivity, bringing critical data ‘nearer’ to users and increasing reliability of access, security and worker productivity. Edge also addresses regulatory compliance requirements for information to be managed and processed in a specific area, and enables autonomy, offering the necessary separation without loss of performance where technology needs to function in isolation from a dedicated network. Its high bandwidth is ideal for systems generating vast quantities of data that cannot feasibly be sent elsewhere for processing.

Covid led to a surge in edge computing, followed by increases in video streaming and online gaming. Waiting in the wings are myriad industrial internet of things applications that industries are moving towards, all of which demand compute power at the edge, especially in remote locations far from central cloud hubs.

One global edge computing market report – covering trend analysis by component, application and industry vertical – predicts the market will reach US$155.90 billion by 2030, a compound annual growth rate of 39%.

Cyber-threats and outages

Being complementary rather than bitterly competitive, edge and cloud are set to prosper as data volumes continue to explode. Both, however, face a common enemy in cyber-crime. Cloud computing is centralised, which makes it more susceptible to threats such as distributed denial of service (DDoS) attacks and outages. Breaches of multiple kinds proliferate, however. An Ermetic-commissioned IDC state of cloud security survey, conducted in the first half of 2021, revealed that almost all companies surveyed (98%) had suffered at least one cloud data breach in the previous 18 months – a significant increase from 79% in the previous survey.

Edge’s vulnerability arises from the pressure on distributed networks imposed by the burgeoning consumer demand for faster, more efficient services. This ramps up the likelihood of outages.

Edge locations often have less redundancy built in, and no on-site engineers, which can make them less resilient than traditional data centres.

This new world of cloud and edge will require organisations to adjust their network management approaches to continue delivering the always-on uptime that customers expect.

New approaches to maintaining uptime

Providers now need proactive monitoring and alerting to keep their cloud infrastructures and edge data centres up and running. They need to ensure they can remediate networks without having to send an engineer on site. Their options include Smart Out of Band (OOB) Management tools, which diagnose a problem and remediate it even when the main network remains congested after a disruption, or even if it is down entirely.

A technology such as Failover to Cellular™ (F2C) provides continued internet connectivity for remote LANs and equipment over high-speed 4G Long Term Evolution (LTE) when the primary link is unavailable. Easily integrating with existing IT systems and network infrastructure, F2C restores WAN connectivity without boots on the ground or human intervention.
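The core idea behind failover to cellular can be illustrated with a simple health-check loop: probe the primary WAN link and, if it is unreachable, route traffic over the cellular uplink. The sketch below is a simplified illustration only, not Opengear’s implementation; the probe endpoint is a hypothetical placeholder.

```python
# Minimal sketch of primary-link health checking with cellular failover.
# The probe endpoint below is hypothetical; a real appliance would probe
# its actual upstream gateway and reprogram routes, not just report a name.
import socket

PRIMARY_PROBE = ("primary-gateway.example.net", 443)  # hypothetical endpoint

def primary_link_up(host, port, timeout=3):
    """Probe the primary WAN link with a simple TCP connect attempt."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def select_uplink():
    """Choose which uplink should carry traffic right now."""
    return "primary" if primary_link_up(*PRIMARY_PROBE) else "cellular"
```

In practice the failover device runs such a check continuously and fails back to the primary link automatically once the probe succeeds again, which is what allows WAN connectivity to be restored without human intervention.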

Organisations also combine automation with network operations (NetOps) for zero-touch provisioning of their Smart OOB devices. The benefit is getting the Smart OOB network provisioned and up and running without the risk of manual error. The same technology is often employed to orchestrate maintenance tasks and automatically deliver remediation in the event of an equipment failure or other technical problem.

That effectively means that after shipping new or replacement equipment to site, an organisation uses Smart OOB to quickly bring the site up via a secure cellular connection. This allows for the remote provisioning and configuration of the equipment where it is, without having to send a skilled network engineer.
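The zero-touch idea in the paragraphs above can be sketched as rendering a per-site configuration from a shared template and serialising it for push over the secure cellular link, so no engineer edits configuration by hand on site. The template fields and function names below are illustrative assumptions, not a vendor API.

```python
# Illustrative zero-touch provisioning flow. The template fields and the
# push format are hypothetical, assumed for the sketch; a real deployment
# would use the management platform's own provisioning interface.
import json

SITE_TEMPLATE = {
    "hostname": "edge-site-{site_id}",
    "cellular_failover": True,
    "monitored_ports": [1, 2, 3, 4],
}

def render_config(site_id: str) -> dict:
    """Fill the shared template for one site - no hand-edited config."""
    cfg = dict(SITE_TEMPLATE)
    cfg["hostname"] = cfg["hostname"].format(site_id=site_id)
    return cfg

def provision(site_id: str) -> str:
    """Serialise the site config for push over the secure OOB link."""
    return json.dumps(render_config(site_id), sort_keys=True)
```

The design point is that the only per-site input is an identifier; everything else comes from the template, which is what removes the risk of manual mistakes during rapid multi-site rollouts.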

Companies using this approach achieve huge cost savings when implementing new edge deployments, especially those trying to do so rapidly across multiple territories. Following deployment, if a problem causes a loss of connectivity to the production network that cannot be resolved immediately, business continuity is maintained: organisations continue to pass mission-critical network traffic across the secure OOB LTE cellular connection.

Positive outcomes across edge and cloud

As cloud services expand and edge computing applications grow in sophistication and ease of implementation, organisations will have to adjust their network management processes to continue delivering the always-on uptime that customers have every right to expect. Achieving this will require hybrid solutions that fully exploit internet and cloud-based connectivity, as well as physical infrastructure. By combining Smart Out of Band with the latest NetOps automation, service providers can be confident they have the always-on network access they need. This is the surest way to deliver the level of network resilience that transforms delivery of cloud and edge capabilities.

Alan Stewart-Brown is VP of EMEA at Opengear, with responsibility for overseeing all Sales, Channel Development, Marketing events and SE activities across the EMEA region. Alan’s primary focus is the development and execution of sales strategies, talent development and channel initiatives that will ensure the accelerated growth of the Opengear business across the region. Alan brings 25 years of sales leadership experience gained across the technology sector, including Wireless LAN, Enterprise Software, BI Analytics and e-Commerce. Before joining Opengear, Alan held senior pan-European sales management positions at Xirrus, Fiserv, AIM Technology, eColor and Phoenix Technologies. Alan holds a Bachelor of Science degree from Imperial College, London.
