
A local cloud for local people


Philip Grannum, Managing Director, Hosting Operations, Easynet Global Services, considers the advantages that localisation has for the delivery of cloud computing services.

I’ve long been aware of the critical nature of location in the decision to site a data centre, factoring in such considerations as availability of power infrastructure, telecom access and physical security. In the Cloud world, however, on the face of it things should be simpler. Data should be accessible from anywhere and the physical hosting location shouldn’t matter – or so say the Cloud purists. Content doesn’t observe national borders. Indeed, a senior engineer at one of the US Cloud leaders admitted that they had initially assumed that two US-based data centre locations would be sufficient to operate a Cloud service.

This US-centric view perhaps portrays a certain level of naiveté when viewed from today’s post-NSA market situation, and indeed the same provider is expanding its footprint across the globe. In reality much content or data is anchored to its home market.

At a philosophical level too I’m not convinced that concentrating most of the content and data into a small number of mega data centres, operated by a small number of providers, is necessarily the right model. The Internet is very robust due to its ubiquity and distributed architecture.

The most natural location for data is close to the user, and so a more dispersed Cloud has potential end-user benefits. Because of this, at Easynet we locate our Cloud hosting services physically in the principal markets we serve. Europe is not a uniform market and local market needs and concerns are in many cases most readily supported with local in-country infrastructures.

Data sovereignty
There has been much discussion recently about the free migration of labour within the EU. Labour may be more or less free to move, but it is not yet so easy for people to take their data with them! Each country in the EU has its own data protection agency and, whilst there is movement towards a common framework in the EU, each currently operates in a different way within a different regulatory framework (and I’m sure that the good folks of Royston Vasey have their own ideas about data protection too!).

Some countries, like Iceland and Switzerland, have been touting themselves as potential safe havens for hosting data outside the main regulatory blocs. The self-declared Principality of Sealand made its own similar attempt.

There’s a useful guide here for those interested.

There are barriers to data migration. There generally needs to be explicit consent from the data owner to move their data outside their borders. Also, even if a Cloud provider’s infrastructure is located within the desired region of operation, if their staff located outside that geography have potential access (e.g. root access to a server with unencrypted data), then that can be judged as non-compliant from a data protection perspective; in other words, the support structure and method can be as important as the physical location.

The EU is implementing a new regulatory regime with a view to providing a more harmonised approach across the EU, for which there is an excellent guide here.

However, in general the message for Cloud companies, and their customers, is that the regime is getting tighter, including the potential obligation to notify the national supervisory agency of data breaches within 24 hours; the rights of data portability for consumers; and the right to be ‘forgotten’.

In the UK, the Government has its own data security policy, which in practice means that UK hosting managed by UK security-cleared staff is the most practical option for many classes of data. Easynet’s G-Cloud accredited hosting is operated wholly onshore in UK for this reason.

Industry sectors like Financial Services, Pharma, Healthcare, Telecoms and Utilities are also subject to further data regulation requirements specific to their industry and data types over and above general data protection provisions. In a post-NSA world with an increased awareness of Cyber threats these regulations are only going to get tougher.

Now I’m not going to enter into a long discussion here about the relative security benefits of Cloud vs. operating on-premise. Data which is not connected to anything, and with no means of direct access to it, is the most secure but also the most useless. Thereafter every decision is a balance of risk between ease of use and control. It is certainly the case that Cloud and hosting companies have a depth of expertise in security management which exceeds that of many of their customers, who can benefit from that expertise in building greater security into their controls. It is also true that multi-tenant Cloud may not provide the optimum balance between risk and access for some critical data, particularly when the data owner has to prove compliance to a sceptical auditor.

Indeed, the practicality of conducting an audit is one area where location can be an important consideration. In proving compliance, be it to PCI or some other standard the customer is working to, it is often required that the auditor has access to the facilities where data is both hosted and accessed from. This can be a lot easier and less expensive to organise if conducted within the country where the host company and the auditor are located.

A storage salesman once said to me that virtual machines can move easily but data is heavy. A nice way of saying that whilst machine images are easy to move around it can be difficult and costly to move around large quantities of data.

This weight matters first when migrating existing systems and data to the Cloud. If you have a virtualised environment it is often possible to migrate machines (subject to compatibility etc.) into a Cloud environment: some Cloud providers won’t allow you to migrate VMs and require a fresh build, but many will allow this. However, if the data is large it may be impractical or costly to migrate over the wire – if connectivity is not already in place it can be expensive to put a large link in for a short-term data transfer, and some providers levy high volume-related charges for data transfers. This may mean that the most practical solution for a large deployment is a physical transfer on media to the host data centre. Again, when up against a short time window, which is typically the case for critical infrastructure, this is usually most practical if the target data centre isn’t too far away. Adding the complexity of cross-border migration can introduce additional time and risk.
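A few lines of back-of-envelope arithmetic show why large data sets so often end up travelling on physical media. The figures below (50 TB of data, a 100 Mbps link, 70% usable bandwidth) are purely illustrative assumptions, not any provider’s quotes:

```python
# Rough sketch: how long does it take to move bulk data over the wire?
# All figures are illustrative assumptions for the sake of the arithmetic.

def wire_transfer_days(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Days needed to move data_tb terabytes over a link_mbps link,
    assuming only `efficiency` of the nominal bandwidth is usable."""
    bits = data_tb * 1e12 * 8                      # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# e.g. 50 TB over a 100 Mbps link at 70% efficiency:
print(f"{wire_transfer_days(50, 100):.0f} days")   # roughly two months
```

At that rate a courier van carrying disks comfortably beats the network, which is exactly when the distance to the target data centre starts to matter.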

The second place it hits is performance. Latency can negatively impact the user experience, particularly if the application is chatty between a user machine and the host. For some applications distance doesn’t matter so much, or its effects can be ameliorated by using a CDN or WAAS. In general, though, Cloud is a lot easier if the target hosting location isn’t too far from the end users in latency terms – if not in-country, then at least in the same region of the globe.
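The effect of chattiness is easy to quantify: if an application makes many sequential round trips, the round-trip time multiplies rather than averages out. The round-trip counts and latencies below are assumed figures chosen only to illustrate the scaling:

```python
# Illustrative only: sequential round trips compound latency linearly.

def screen_load_seconds(round_trips: int, rtt_ms: float) -> float:
    """Time spent purely on network round trips to render one screen."""
    return round_trips * rtt_ms / 1000

# A screen needing 50 sequential round trips:
print(screen_load_seconds(50, 5))    # in-country (~5 ms RTT): 0.25 s
print(screen_load_seconds(50, 150))  # transcontinental (~150 ms RTT): 7.5 s
```

The same application goes from feeling instant to feeling broken purely because of where it is hosted.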

If you’re replicating a lot of data between two sites it is also useful if they stay within the same geography. Carriers generally charge a lot more for international transport than domestic circuits at the same bandwidth. It may also be the case (particularly for synchronous applications), that staying within a metro, or proximate to a metro, can have significant advantages (hence the location of data centres in a ring around London providing backup infrastructure to clients located within the M25).
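For synchronous replication the physics sets a hard floor: light travels through fibre at roughly 200,000 km/s, so every kilometre of path adds round-trip delay to every acknowledged write, before any routing overhead. The distances below are assumed examples, but the speed-of-light bound is real:

```python
# Sketch: why synchronous replication favours metro distances.
# Real networks add routing and queuing delay on top of this floor.

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fibre (no routing overhead)."""
    fibre_speed_km_per_s = 200_000           # ~ speed of light in glass
    return 2 * distance_km * 1000 / fibre_speed_km_per_s

print(min_rtt_ms(50))    # metro ring (~50 km): 0.5 ms per synchronous write
print(min_rtt_ms(1500))  # cross-border (~1500 km): 15 ms per synchronous write
```

Half a millisecond per write is tolerable; fifteen milliseconds on every committed transaction generally is not, which is one reason backup data centres cluster in a ring around the metro they serve.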

Are these practical considerations going to erode rapidly so we end up in a borderless data world, with Cloud operations mainly in mega data centres operated by the industry giants, and where data sovereignty becomes an anachronism? Or will we increasingly move towards a more diverse and distributed model with a higher degree of localism and variety? I hope it’s the latter. What do you think?