Shopping for Data: Ensuring a seamless user experience 

More and more companies today are migrating their data to the cloud. With lower operating costs, flexibility of infrastructure, new data governance possibilities and fewer data silos, the cloud data experience allows businesses to easily access their data whenever they need to.

Or at least that is how it is meant to work in theory. In reality, many businesses are still unable to access specific data when they need it most, despite having spent years collecting data. Too often, businesses lack the internal data habits needed to create a well-rounded and more accessible data infrastructure.

As a result, businesses looking to fast-track their migration to the cloud still face the threat of slow data scaling due to a lack of data access control policies. How can businesses optimise their data strategy to ensure they provide employees with the access they need to analyse, understand and trust data, and most importantly, make reliable business decisions from that data?

Data shopping: a business’s ‘grocery store’

The success of any data strategy largely relies on the ability to understand, trust and utilise the data collected. Unfortunately, this tends to be where many projects fail, as businesses lack a solid long-term plan to increase data literacy. Research conducted by IDC and Collibra found that 67% of businesses said having access to intelligence about the data being used to make decisions is crucial to success. Yet over 65% of organisations face challenges with identifying and managing their data sources.

To tackle these challenges, one approach to data use and management that has been gaining popularity across the industry is the concept of a data marketplace. A data marketplace serves as a digital shop window for all of a company’s data, which employees can use to search, find and extract specific data records.

The appeal of the data marketplace is that it offers a data shopping experience that mirrors classic online shopping. A user can filter their search for a desired dataset using different categories and specifications in the same way they would search for a product on Amazon, and then ‘check out’ the data they need. This data shopping experience is designed to provide an easy-to-use approach, opening up employee access to all the data they need and placing it at their fingertips.
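The filter-then-check-out flow can be sketched in a few lines. Everything below is a hypothetical illustration — the dataset records, field names and filter criteria are invented to show the faceted-search idea, not any particular vendor's API:

```python
# Hypothetical catalogue of dataset records; all fields are illustrative.
DATASETS = [
    {"name": "eu_customers_2023", "domain": "sales", "region": "EU", "quality": 4.5},
    {"name": "us_web_traffic", "domain": "marketing", "region": "US", "quality": 3.8},
    {"name": "eu_invoices_q4", "domain": "finance", "region": "EU", "quality": 4.9},
]

def search(catalogue, **filters):
    """Return datasets whose attributes match every given filter (the 'facets')."""
    return [d for d in catalogue
            if all(d.get(k) == v for k, v in filters.items())]

def check_out(dataset):
    """Simulate 'checking out' a dataset: record an access request."""
    return {"dataset": dataset["name"], "status": "requested"}

eu_sales = search(DATASETS, domain="sales", region="EU")
print([d["name"] for d in eu_sales])  # ['eu_customers_2023']
print(check_out(eu_sales[0]))         # {'dataset': 'eu_customers_2023', 'status': 'requested'}
```

In a real marketplace the facets would come from catalogue metadata and the check-out would trigger an approval workflow, but the shape of the interaction is the same.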

However, creating a data marketplace is only useful if an organisation can encourage staff to use it. If it does not offer easy access and a seamless experience, employees will not adopt this solution. So, how can organisations design an effective data shopping experience?

Data accountability: keeping the shelves stocked

It is easy to assume that all data is good data, but just because data is discoverable does not guarantee it to be relevant or of good quality. Data quality can have a significant impact on the successful integration of data shopping across the business: if employees can only find ‘bad quality’ data which is unlikely to be relevant or useful, they are far less likely to use the provided solution.

The general rule of thumb is, no matter how great a shop looks, if it’s full of items you don’t want or need then the shopping experience will be poor. To use the Amazon analogy again, if you’re looking for a product and see a bunch of poor reviews, you’re not likely to buy it. Data is no different and poor quality data is unlikely to get ‘checked-out’. Business leaders should take accountability for the data being put into the marketplace and ensure that the right data is available in the right form at the right time.

But as well as keeping on top of what goes into the data marketplace, businesses must be cautious that data — especially personal data — is handled with discretion and is subject to further duty of care.

Identifying and protecting sensitive data

One of the key challenges in data management is the ability to regulate the access to and use of sensitive data. Privacy regulations call for extra caution when handling sensitive data such as personally identifiable information or protected health information.

It is right that businesses implement protections for sensitive data to control who can access it and for what purpose. Failing to protect sensitive information can lead to data breaches and regulatory fines for the company.
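The ‘identify’ half of that task can be partly automated: before data is published, a pipeline can flag columns whose values look like personally identifiable information. The sketch below is a deliberately simplified assumption — two illustrative regex patterns and an invented 80% threshold — not an exhaustive PII detector:

```python
import re

# Illustrative patterns only; a production detector would use many more
# rules, validation checks and context (column names, data lineage, etc.).
PII_PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"^\+?[\d\s\-()]{7,15}$"),
}

def classify_column(values):
    """Label a column as a PII type if most of its values match a pattern."""
    for label, pattern in PII_PATTERNS.items():
        matches = sum(1 for v in values if pattern.match(v))
        if values and matches / len(values) > 0.8:
            return label
    return None  # no confident PII classification

print(classify_column(["alice@example.com", "bob@example.org"]))  # 'email'
print(classify_column(["widget", "gadget"]))                      # None
```

Columns flagged this way can then be tagged in the catalogue so that access policies treat them with the extra care the regulations demand.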

However, exerting too much control and reducing access to all data will limit the value that data can generate; employees could be left with no access to data when it matters most.

As such, businesses migrating to the cloud need to find the right balance between setting up strong access control policies without slowing the ability to scale data access across the organisation.

The three-step approach

Businesses looking to ensure compliant access to data and analytics in the cloud must overcome these common challenges associated with data governance and data quality. Here is a three-step approach to guide businesses through this process.

First, divide and conquer. The business must distribute responsibility for systematising data to specific departments. Doing so when moving data to the cloud prepares a business for compliant use. Most businesses have been collecting data for years, meaning that most data ecosystems have evolved across several generations of systems and tools. Therefore, it is essential to have a complete understanding and overview of the company’s data and its sources, both past and present.

Next, prepare for the big move. A key strategy here should be universal data authorisation. Using the right tools and solutions to regulate data use, build context and identify sensitive data is essential. Businesses should look to simplify, standardise and automate access controls and policies throughout this process. Tools can make it simple and effective for businesses to move their data. For instance, they can be used throughout the migration process to create new policies and apply them dynamically at scale.

Finally, apply discretion. A data catalogue can help structure and keep track of the data management process in the cloud end to end. Business metadata, data classifications and technical metadata would essentially feed the access management engine and data pipelines, enabling policy-driven access to sensitive data. Once the data catalogue is populated with shared records, it should be ready for access and universally available. This data catalogue can be used as the basis for the data marketplace. It is also important for businesses to prepare policy-driven strategies for handling sensitive data.
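To make the idea of classifications feeding an access management engine concrete, here is a minimal sketch. The catalogue entry, roles, classification labels and masking rule are all hypothetical assumptions for illustration, not a description of any specific product:

```python
# Hypothetical catalogue entry: per-column classifications, as a data
# catalogue might store them.
CATALOGUE = {
    "customers": {"columns": {"name": "pii", "country": "public", "email": "pii"}},
}

# Hypothetical policy: which classifications each role may see unmasked.
POLICY = {"analyst": {"public"}, "steward": {"public", "pii"}}

def fetch(dataset, row, role):
    """Return a row with columns masked according to the role's policy."""
    allowed = POLICY.get(role, set())
    cols = CATALOGUE[dataset]["columns"]
    return {col: (val if cols[col] in allowed else "***")
            for col, val in row.items()}

row = {"name": "Ada", "country": "UK", "email": "ada@example.com"}
print(fetch("customers", row, "analyst"))  # {'name': '***', 'country': 'UK', 'email': '***'}
print(fetch("customers", row, "steward"))  # row returned unmasked
```

The point of the pattern is that access decisions live in policy and metadata rather than in each consuming application, so tightening or loosening access does not require changing the pipelines themselves.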

The check-out

If these steps are done correctly, businesses can safely migrate their data to the cloud and offer employees a streamlined data shopping experience. Employees will be able to discover and access available data, transparently view its content and metrics, and compare datasets without the risk of breaching data protection rules. Most importantly, they will be able to trust their data, leading to better-informed business decisions that truly move the needle on business outcomes and deliver a strong return on data shopping initiatives.

This combination can drive a business’s data culture and provide a structured approach for businesses to benefit from data intelligence across their operations, with only a few clicks.


As EVP of Global Sales, Mark heads up Collibra's sales, business development, sales operations and sales enablement functions across all geographies and all industries. Mark has over 30 years of experience in deploying, managing, consulting and selling data technologies such as BI, data warehousing, analytics, CRM and AI, where his focus has been to enable some of the world's largest organisations to become truly data-driven, create real value and tangible business outcomes, and deliver strong ROI. Mark has been with Collibra for over three years and has overseen its sales business first in EMEA, then APAC, and now at a global level. Prior to joining Collibra, Mark was responsible for Analytics and AI sales in EMEA at Salesforce. Mark has previously held leadership positions at Bluewolf, Atex, NTT Data and Morse.
