The 4 steps towards transforming your business with copy data virtualisation

As terabytes turn into petabytes, the surge in data volumes has sent costs spiralling. At the current rate of production, by the end of this year the world will be producing more digital information than it can easily store.


This deluge of data caused by increased connectivity poses a serious challenge. The sheer quantity of data being created is causing the complexity and cost of data management to skyrocket. IDC predicts that by 2020, 1.7 megabytes of new information will be created every second for every human being on the planet. Trying to make sense of this data is going to be a huge challenge for all organisations.
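To put that IDC projection in perspective, here is a quick back-of-the-envelope calculation. The world population figure is an assumption added for this example; it is not part of the IDC prediction.

```python
# Rough scale of the IDC projection: 1.7 MB of new data per second, per person.
# The ~7.7 billion population figure is an assumption, not part of the IDC stat.
MB_PER_SECOND_PER_PERSON = 1.7
WORLD_POPULATION = 7.7e9
SECONDS_PER_DAY = 86_400

mb_per_day = MB_PER_SECOND_PER_PERSON * WORLD_POPULATION * SECONDS_PER_DAY
exabytes_per_day = mb_per_day / 1e12  # 1 exabyte = 1e12 MB (decimal units)

print(f"New data per day: ~{exabytes_per_day:,.0f} EB")  # ~1,131 EB per day
```

At that rate the world would be creating over a thousand exabytes of new data every day, which is why storing multiple copies of it matters so much.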

So, why does this problem exist?

It’s down to the proliferation of data copies – multiple copies of the same thing or outdated versions. Consumers make many copies of data: on backup drives, multiple devices and cloud storage. Businesses are worse because of the need to maintain copies for application development, regulatory compliance, business analytics and disaster protection. IDC estimates that 60 per cent of what is stored in data centres is actually copy data, costing companies worldwide as much as $44 billion to manage.

The increase in copy data also poses a significant security threat. A recent IDC study found that the typical organisation holds as many as 375 copies of its data. Each added copy increases an organisation’s attack surface, giving hackers looking for important information more material to work with.

Right, so what is the way forward?

You’ve heard of server virtualisation. You’ve heard of network virtualisation. But have you considered virtualising your data yet? Copy data virtualisation – the process of freeing an organisation’s data from its legacy physical infrastructure – is increasingly how forward-thinking companies tackle this problem. By eliminating redundant physical copies in favour of a single ‘golden master’, virtual copies of production-quality data become available immediately to everyone in the organisation who needs them.
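As a rough illustration of the concept (a minimal sketch, not Actifio’s actual implementation), a virtual copy can be modelled as a copy-on-write view over the golden master: every copy shares the master’s blocks, and only changed blocks consume new space.

```python
# Minimal copy-on-write sketch of a 'golden master' with virtual copies.
# Illustrative only: real platforms do this at the block/storage layer.

class GoldenMaster:
    def __init__(self, blocks: dict):
        self.blocks = blocks  # block number -> data

class VirtualCopy:
    """A near-instant 'copy' that stores only the blocks it changes."""
    def __init__(self, master: GoldenMaster):
        self.master = master
        self.overlay = {}  # only modified blocks live here

    def read(self, block_no: int) -> bytes:
        # Prefer this copy's own writes; fall back to the shared master.
        return self.overlay.get(block_no, self.master.blocks[block_no])

    def write(self, block_no: int, data: bytes) -> None:
        self.overlay[block_no] = data  # the master is never touched

master = GoldenMaster({0: b"prod", 1: b"data"})
dev_copy = VirtualCopy(master)   # created instantly, no extra storage
dev_copy.write(1, b"test")
print(dev_copy.read(1))          # b'test'  (the copy's own change)
print(master.blocks[1])          # b'data'  (production data untouched)
```

This is why a virtual copy is available immediately: creating one is a metadata operation rather than a physical duplication of the data.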

For IT managers, copy data virtualisation could be the way to address the relentless growth in data. Like any significant overhaul, implementing it across an organisation requires planning and strategic thinking.

I believe there are four key steps for businesses to follow to maximise the copy data virtualisation opportunity. Here they are:

  1. Your choice of platform

Every organisation faces its own set of challenges, and these will shape the choice of platform. While the choice has to suit the needs of each individual business, there is a commonly used set of criteria. A typical enterprise will have workloads across a number of different systems – virtual machines on VMware and physical machines on Windows, for example. The chosen platform has to support all of these systems, as well as the databases and applications that run on them. All of this is a prerequisite for copy data virtualisation to take effect.

There are further considerations too. The platform should be infrastructure-independent to allow for different use cases, a single point of control will keep it simple to manage, and support for hybrid cloud offers the ability to move applications into alternative data centres.
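One way to make these criteria concrete is a simple scorecard to run candidate platforms against. The capability names below are invented for illustration; they do not come from any particular product.

```python
# Hypothetical scorecard for the platform criteria described above.
REQUIRED_CAPABILITIES = [
    "supports_vmware_vms",          # virtual machines on VMware
    "supports_physical_windows",    # physical machines on Windows
    "supports_databases_and_apps",
    "infrastructure_independent",   # allows for different use cases
    "single_point_of_control",
    "hybrid_cloud_support",         # move applications between data centres
]

def missing_capabilities(platform: dict) -> list:
    """Return the required capabilities the candidate platform lacks."""
    return [cap for cap in REQUIRED_CAPABILITIES if not platform.get(cap, False)]

candidate = {cap: True for cap in REQUIRED_CAPABILITIES}
candidate["hybrid_cloud_support"] = False
print(missing_capabilities(candidate))  # ['hybrid_cloud_support']
```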

  2. The initial use case

IT departments adopt a number of overlapping technologies, such as software for backups, snapshots, disaster recovery and more. Data virtualisation removes the need for these redundant technologies by creating virtual, on-demand data copies that serve a number of these use cases from one platform. However, this doesn’t happen instantly, and nor should it if you want your move to copy data virtualisation to be successful.

Choose one use case initially and roll it out first. In doing this you’ll be able to iron out any issues that come up before a wider roll-out.

  3. What exactly do you need to consider?

So, your platform has been chosen and an initial use case identified – what next? Now it’s the time to understand your specific needs so you can design the underlying infrastructure to support accordingly. You need to be asking some important questions:

  • What rate is the production data changing at?
  • What is the retention time required for virtualising backup?
  • How many virtual data copies do you need at any one time?
  • What testing will be done with that data (performance, functionality, scaling etc.)?
  • How much bandwidth do you need (especially important if you’re working with multiple data centres across different locations)?
  • How is your data being replicated and encrypted?

It’s pivotal that you’re able to answer all of these questions before you start investing in infrastructure – it will save you a lot of time and money.
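To show how the answers feed into capacity planning, here is a back-of-the-envelope sizing sketch. Every input value is a made-up example, not a recommendation; substitute your own measurements.

```python
# Hypothetical sizing based on the questions above; all inputs are examples.

production_size_gb = 5_000   # size of the golden master
daily_change_gb = 200        # rate at which production data changes
retention_days = 30          # retention required for virtualised backup

# With virtual copies, extra capacity is driven by changed data kept for
# the retention window, not by the number of copies mounted.
capacity_gb = production_size_gb + daily_change_gb * retention_days
print(f"Estimated capacity: {capacity_gb:,} GB")  # 11,000 GB

# Replication bandwidth: each day's changed data must reach the second
# data centre within a day (86,400 s), before compression/deduplication.
bandwidth_mbps = daily_change_gb * 8_000 / 86_400  # GB/day -> Mbit/s
print(f"Sustained replication bandwidth: ~{bandwidth_mbps:.0f} Mbit/s")  # ~19
```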

  4. Hybrid cloud

Lots of organisations have begun harnessing both private and public cloud offerings to create a hybrid cloud infrastructure. These hybrid clouds combine the control and security of a private cloud with the flexibility and low cost of public cloud offerings. Working together, they give organisations a powerful way to meet the increased demands placed on IT by the rest of the organisation.

One of the main benefits of this hybrid cloud approach is enhanced agility – using public cloud means you can experience fewer outages and less downtime. Using a private cloud is good for the testing and development of new applications before deciding on where you’d like to host applications permanently. A hybrid approach also allows you to use multi-purpose infrastructure – for data recovery and test and development simultaneously, for example – helping to cut down on costs and complexity.
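As a toy illustration of those trade-offs, a placement rule might look like the sketch below. The function and its rules are invented for this example, not a prescription.

```python
# Toy hybrid-cloud placement policy reflecting the trade-offs above.

def place_workload(purpose: str, needs_tight_control: bool) -> str:
    if needs_tight_control:
        return "private cloud"   # control and security stay in-house
    if purpose in ("test", "development"):
        return "private cloud"   # prove new apps before permanent placement
    return "public cloud"        # flexibility and lower cost

print(place_workload("development", needs_tight_control=False))  # private cloud
print(place_workload("reporting", needs_tight_control=False))    # public cloud
```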

By implementing copy data virtualisation and reducing physical copies of data sooner rather than later, organisations spend less on storage and get to the most important stage of data management – analysis – more quickly. The results are wide-ranging: less data moving across networks, less stored data, significantly reduced storage costs, and the removal of costly operational complexity.


We like to view data virtualisation as a way to give an organisation virtual sanity.

Ash Ashutosh, CEO, Actifio
Ash Ashutosh brings more than 25 years of storage industry and entrepreneurship experience to his role as CEO of Actifio. Ashutosh is a recognized leader and architect in the storage industry, where he has spearheaded several major industry initiatives, including iSCSI and storage virtualization, and led the authoring of numerous storage industry standards. Ashutosh was most recently a Partner with Greylock Partners, where he focused on making investments in enterprise IT companies. Prior to Greylock, he was Vice President and Chief Technologist for HP Storage.
Ashutosh founded and led AppIQ, a market leader of Storage Resource Management (SRM) solutions, which was acquired by HP in 2005. He was also the founder of Serano Systems, a Fibre Channel controller solutions provider, acquired by Vitesse Semiconductor in 1999. Prior to Serano, Ashutosh was Senior Vice President at StorageNetworks, the industry’s first Storage Service Provider. He previously worked as an architect and engineer at LSI and Intergraph.

Ashutosh remains an avid supporter of entrepreneurship and is an advisor and board member for several commercial and non-profit organizations. He holds a degree in Electrical Engineering and a Master’s degree in Computer Science from Penn State University.
