When his firm was hit by a ransomware attack, with attackers demanding more than $100,000 to release its files, Rob Svendsen, IT manager of construction company S.J. Louis, did not panic. This was despite the fact that all the records for recent and former jobs were on the servers, including $200 million worth of current construction work. That’s because S.J. Louis has been clever about its data: as an early adopter of enterprise cloud file services, it took a holistic approach that moves the gravity of enterprise data to the cloud.
With enterprise file services, a file server that is hit by ransomware, accidentally deleted, or even doused in a beverage intended for human consumption is not a major cause for concern, since the data itself lives in the cloud. And with modern file services that employ clever caching technologies, fully recovering from such a catastrophic loss can be achieved in minutes. In the world of cloud file services, data follows the user as it flows between clouds, offices and endpoints. But as organisations make their transformational journey to the cloud, they face diverse challenges in the areas of infrastructure management, data governance, privacy and security.
Successful file services platforms combine four access methods, allowing organisations to achieve safe, seamless access to their entire global enterprise data. But first, let’s talk a bit about how things look in most enterprises today.
Legacy silo systems and unstructured data
Most organisations rely on legacy systems and architecture in their datacentres, which hinders versatility. Within these environments, data is typically unstructured, with project files locked in silos, stored on users’ devices or scattered across branch offices in multiple locations. This lack of interoperability is not conducive to file protection in the cloud and can present many intractable and costly challenges, such as critical data loss.
The challenge of protection
How to prevent the loss of valuable information is a major conundrum for most enterprises. It is practically impossible to ensure the protection of data on servers located in distant branch offices, in unprotected locations or on employees’ laptops. If a laptop goes missing, for example, vital data is likely to be lost with it. Hardware or system failure, as well as ransomware and other malware, are very common causes of data loss. Predictably, human error is often to blame, whether it’s a result of accidentally pressing the delete button, leaving a laptop on a train or dropping a mobile phone. People always make mistakes.
Another key issue is what we call ‘dark data’ which may represent either a lost opportunity or a security risk to a company. Dark data is dispersed digital information which is stored but unutilised. By adopting a file services platform, organisations can consolidate all files from all users, servers, NAS devices, etc., and integrate the data into a single solution.
The optimum file services platform will remove storage capacity limitations by dynamically caching files from any secure cloud to enterprise edge devices and desktop users. This allows users to access, share and protect an unlimited number of files in the cloud as if they were stored locally without being constrained by local storage capacity or security compromises.
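The caching behaviour described above can be sketched in a few lines of Python. This is an illustrative sketch, not any vendor's implementation: the `fetch` callable is a hypothetical stand-in for the platform's cloud object-store call, and the edge device keeps only the most recently used files locally, pulling anything else on demand and evicting the coldest file when capacity runs out.

```python
from collections import OrderedDict

class EdgeCache:
    """Minimal LRU sketch: serve hot files locally, fetch the rest on demand."""

    def __init__(self, fetch, capacity=100):
        self.fetch = fetch          # hypothetical callable that pulls a file from the cloud
        self.capacity = capacity    # how many files the edge device can hold locally
        self.files = OrderedDict()  # path -> contents, ordered by recency of use

    def read(self, path):
        if path in self.files:               # cache hit: no cloud round-trip
            self.files.move_to_end(path)
            return self.files[path]
        data = self.fetch(path)              # cache miss: bring the file on demand
        self.files[path] = data
        if len(self.files) > self.capacity:
            self.files.popitem(last=False)   # evict the least-recently-used file
        return data
```

From the user's point of view every path is readable as if it were local; the capacity limit only affects how often a read has to travel to the cloud, which is why local storage stops being a hard constraint.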
The four methods for accessing cloud file services
A file services platform should enable the storage of all the organisation’s information, including unstructured files, making access seamless and simple from any location or device. Ideally a combination of four main methods should be employed:
- Web interface – The first method would be a web interface, where workers can connect to a web portal from which they can access all their files.
- Endpoint client – The second method is the installation of an agent on users’ laptops or desktop computers that allows them to work on files stored in the file services platform. This can be achieved either by synchronising a proportion of the files from the file services platform onto these devices, or by providing cached access. The agent client essentially enables users to see all of the information in their file services, with the structure appearing the same as it did prior to migration. The information that is most frequently needed will be cached locally while the rest of the data will be brought on-demand.
- Mobile app – The third access method is a mobile client that allows users to access all of an organisation’s information from a mobile device, whether a phone or a tablet. The mobile client is an application that exposes the information so it can be accessed easily from a variety of devices.
- The caching gateway – While the first three access methods are likely to be familiar to individuals who store data in the cloud, this final method is not available on most platforms. The caching gateway is a client device that allows an organisation to access all of its documents in the file services platform in the same way a file server or a NAS device would be accessed, using traditional protocols such as SMB (Server Message Block) or NFS (Network File System).
A caching gateway enables the enterprise to seamlessly migrate existing file servers or filers into a consolidated worldwide file services cloud platform. But from the users’ perspective, there is little change, as they can view the same information that was visible from the existing server and everything continues to work just as before.
The power of combining access methods
In order for the decision to migrate global company data to a cloud file services platform to make business sense, it is critical to implement a global file services solution that utilises the benefits of all four access methods. Organisations must be able to continue working without disruption, just as they did prior to migration.
The importance of maintaining file structure
Users might have files stored in a particular folder structure. For example, employees may regularly work with a number of Excel spreadsheets that need to be kept under the exact same drive mappings as before. It would be very difficult for an organisation to migrate to a solution that could not continue to serve this type of workload in this way.
In the unlikely event that a caching gateway fails, all that is needed is the installation of a second gateway, and the files are instantly restored intact.
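That recovery guarantee rests on the gateway holding no authoritative state: the namespace and the file contents live in the cloud, so a replacement gateway sees the same file tree the moment it connects. A minimal Python sketch of the idea, with a hypothetical in-memory dictionary standing in for the platform's cloud object store:

```python
# Hypothetical in-memory stand-in for the platform's cloud object store.
CLOUD_NAMESPACE = {
    "/projects/site-a/plan.xlsx": b"plan",
    "/projects/site-b/budget.xlsx": b"budget",
}

class CachingGateway:
    """Stateless front end: authoritative data stays in the cloud, so the
    gateway's local cache is disposable and a replacement loses nothing."""

    def __init__(self, cloud):
        self.cloud = cloud
        self.cache = {}                      # warm cache only; safe to lose

    def listdir(self, prefix):
        # Directory listings come from cloud metadata, not from local disk.
        return sorted(p for p in self.cloud if p.startswith(prefix))

    def read(self, path):
        if path not in self.cache:           # cold read pulls from the cloud
            self.cache[path] = self.cloud[path]
        return self.cache[path]
```

If one gateway is lost, standing up a second instance against the same cloud namespace immediately exposes an identical file tree; the new cache simply repopulates lazily as users read files.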
Non-disruptive migration is an important capability of a cloud file services platform. It is especially important for organisations that have previously invested heavily in traditional systems and legacy architecture, which may become obsolete during an organisation’s digital transformation. The beauty of this approach is that enterprises can achieve the agility they are after, without having to start from scratch with the procurement of entirely new systems.
As enterprises grow in the global market and embrace digital transformation, more and more people are working remotely and expect data to be available to them wherever they are. It is vital to these users, and the companies they work for, that travel and remote office working equates to business as usual. With caching capability allowing for multiple gateways to a single global file services platform, organisational data can be quickly and securely accessed from desktops, laptops and mobile devices at any time, from anywhere.