Moving to the cloud: A security breach waiting to happen?

When Gartner predicted at the start of the year that 95 per cent of cloud security failures would be a customer’s fault, many in the industry took a sharp intake of breath.


It’s true there’s often a sense among IT security teams that they cannot protect what they no longer directly manage, even though they know that moving to the cloud brings great benefits. It’s akin to leaving your child at a nursery – a tough decision, often accompanied by a natural sense of fear, but the right thing to do on balance.

And it’s right that Gartner highlights the concern. Security needs to be at the heart of all IT decisions. There have been too many high profile breaches for complacency.

But it’s not necessarily true that cloud adoption is impeded by a lack of trust.

The reality is that there are many established cloud service providers that have better general security operations and practices than some enterprise organisations.

It makes sense. Knowing security concerns have been a deterrent to adoption, large providers have put a huge emphasis on establishing strong standards – you can now find ISO standards, assurance registries and frameworks for establishing which providers you can trust.

That said, it’s not enough. There are still risks. Inevitably, when you share resources with others there is always the potential for ‘collateral damage’.

No one wants to be a cyber domino, but the possibility exists. If the service provider is overwhelmed by a DDoS attack, unintended victims can be brought down too. Hackers go for glory, and a service provider is a prime target for causing maximum disruption to the largest number of customers.

Similarly, if an application running on the cloud has a vulnerability that is breached, the breach could lead to unauthorised access to other parts of the cloud environment.

So while there is a common thought that eventually everything will be in the cloud, it’s not surprising that 79 per cent of IT professionals think that some applications will always need to be hosted on premises.

That’s because some applications require extremely low latency because they are tied to physical operations, or are bound by data sovereignty rules that won’t allow data to be hosted elsewhere. These scenarios depend on security, so the decision to move lock, stock and barrel isn’t as simple as it sounds.

The very nature of having both on- and off-premises applications means multiple demands on the security team. Often that’s where trouble lies. It’s not unusual for organisations to suffer a breach and discover it was caused by an application no one in the security team knew about.

That’s in part why Gartner’s prediction was so severe. The ease of finding and deploying cloud services is likely to erode some of the discipline around application security, from creating secure code to ongoing maintenance and management.

This is where policy, process and people all need to join together to ensure every application decision is documented and thought through. But with the pressure of managing security operations across a company’s own data centres and networks, as well as across multiple cloud providers, it’s easy to see how a ‘hybrid’ strategy could run into problems.

If you’re supporting the orchestration of security policies across multiple environments, there will be added pressure on stability and agility. No security team would want to stand in the way of responding to market demands, but equally, keeping the status quo protected is a full-time job.

So what’s the answer? Stepping back is never a bad thing to do. It creates the time to consider how you’ll support the company strategy that needs to be delivered in the next 12 months and the next 10 years. For example, if you’re in an industry where IoT will be fundamental to your success then the applications you’ll be dealing with could grow as quickly as the devices you have connected to the network. That brings risk.

There’s therefore no harm in taking a risk-based approach when weighing the options and determining the best route to cloud adoption in strategic scenarios like this. It’s important to assess the impact of a breach, identify the applications whose downtime would render operations useless, weigh the merits of a private versus public cloud strategy, and understand what the service provider can offer – will application layering streamline management, for example?

With such an approach you can determine where the priorities lie, how cloud technologies need to blend and where the weaknesses will be.

Of course, much of this analysis is about assessing the capability of the cloud against the strategy. But there is one fundamental development of the last six months that can also help with the decision making process – namely ‘always on’ cloud protection.

Always-on detection and mitigation is a widely adopted technique in corporate networks, and it makes sense to extend what you already have to the cloud. Why wouldn’t you take advantage of cost efficiency, scalability and working with technology you already know?

If you knew your infrastructure, whatever its guise, could detect very specific threats like an SSL attack and keep data anonymous, you’d take it. And you’d be even more inclined to adopt if the service also recognised when new applications had been added, identified that they needed protecting, and brought them automatically into the security fold.

Given the application management complexities described, and the constant need to stay ahead of the competition, having one less thing to worry about will be a boon to the security team. The inadvertent breach will be far less likely, and if the worst happened, the defences would go up automatically.


Gartner’s prediction was well founded, but advancements like these reduce the risks and make decisions about how and when to adopt cloud more clear-cut. Most importantly, they bring the cost savings and operational efficiency the cloud promises within reach.

Adrian Crawley is the VP of sales for Synack’s EMEA division. With a long career in cyber security, Adrian advises companies on strategy, and in particular how to secure the enterprise and applications using crowdsourced security testing models.
