2019 is already proving to be a banner year for the big data industry – no question. We’ve seen cloud migration, a critical factor for many big data projects, increase markedly over the past year. Hybrid cloud models are becoming a well-trodden path for enterprises able to stitch together their increasingly complex, business-critical data pipelines with speed, reliability and cost-effectiveness.

For the industry, Wikibon projects that worldwide revenues for big data software and services will grow from $42B in 2018 to $103B in 2027. And according to an Accenture study, 79 per cent of enterprise executives agree that companies that do not embrace big data will lose their competitive position and could face extinction.

Cloud has begun to stake its claim as a staple of this in 2019. As data delivery options converge and enterprises seek the scalability of platforms that help them achieve their goals, they also face a skills gap that isn’t closing anytime soon. With expert resources thin on the ground and commanding high salaries, both automation and AI are breaking out and becoming essential components of delivery and operations.

The talent gap within DevOps and big data has rapidly become a barrier to the growth and efficiency of analytics operations. In a recent survey conducted for Unravel Data by Sapio Research, one in three enterprise business and IT decision makers revealed that talent scarcity was one of their biggest pain points. Perhaps as a consequence, 34 per cent also claimed it takes too long to get to insight.

With this premise proving true, we will start to see AIOps converging with DevOps as a top priority this year. For the enterprise, data is funnelled into training and improving AI-oriented applications and their development and delivery. Where an algorithm can take the strain off an over-stretched employee, it significantly speeds efficiencies in the business, proving to be one of the data landscape’s biggest value drivers. No more opening support tickets and waiting patiently for a hounded DevOps engineer to find and diagnose a fault. Instead, businesses can look forward to efficiency in both process and output, with improving and maintaining operations as the focus.
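To make that idea concrete, here is a minimal sketch in Python of the kind of check an AIOps layer might run automatically: flagging a pipeline job whose latest runtime drifts well outside its historical baseline, rather than waiting for a human to triage a support ticket. The function name, history length and threshold are illustrative assumptions, not any vendor’s actual API.

```python
# Hypothetical sketch: flag a job run whose duration is far outside its baseline.
from statistics import mean, stdev

def runtime_anomaly(history_secs, latest_secs, n_sigmas=3.0):
    """Return True if the latest run is an outlier against the job's history."""
    if len(history_secs) < 5:  # too little history to form a baseline
        return False
    baseline = mean(history_secs)
    spread = stdev(history_secs)
    # Guard against a near-zero spread making the threshold meaningless.
    return latest_secs > baseline + n_sigmas * max(spread, 1.0)

# Example: a nightly ETL job that normally takes around ten minutes
history = [590, 610, 605, 598, 620, 615]
if runtime_anomaly(history, latest_secs=1480):
    print("ALERT: runtime far outside baseline - trigger automated diagnosis")
```

In practice the “automated diagnosis” step is where an AIOps tool adds its value, correlating the slow run with logs, configuration changes and resource contention instead of leaving that detective work to an engineer.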

The stack has become ungovernable for many enterprises. This is not a problem that can be solved by throwing more resources at the challenge; the issue lies in how DataOps can stay both efficient and effective. All data-reliant enterprises are putting time and money into stitching the fabric of the data stack together so they can extract fast, usable insights from their BI and various data sources. A case in point: Unravel’s research showed that although 84 per cent of respondents claim their big data projects usually deliver on expectations, only 17 per cent currently rate the performance of their big data stack as ‘optimal’.

This is where application performance management (APM) comes into play in the big data stack – just as it does in other stacks. APM for the data stack will ensure that the final data outputs are in line with guidance at the right level, and that all the processes and individual components are optimised throughout the pipeline. From code-level performance profiling to customer application and application log data, optimisation makes ideal performance more than a dream.
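As a rough illustration of what code-level profiling in a data pipeline could look like, the sketch below (plain Python, with a stand-in metrics list and hypothetical names rather than a real APM backend) times each pipeline stage and records how many records flow in and out, so regressions surface before the final outputs drift.

```python
# Hypothetical sketch: per-stage instrumentation for a data pipeline.
import time
from functools import wraps

METRICS = []  # stand-in for an APM / metrics backend

def instrument_stage(stage_name):
    """Decorator that records each stage's duration and record counts."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(records):
            start = time.perf_counter()
            result = fn(records)
            METRICS.append({
                "stage": stage_name,
                "duration_s": round(time.perf_counter() - start, 4),
                "records_in": len(records),
                "records_out": len(result),
            })
            return result
        return wrapper
    return decorator

@instrument_stage("filter_invalid")
def filter_invalid(records):
    # Toy stage: drop records not marked as valid.
    return [r for r in records if r.get("valid")]

filter_invalid([{"valid": True}, {"valid": False}])
print(METRICS)  # one entry per instrumented stage execution
```

A commercial APM product would ship these measurements to a central store and correlate them with cluster and application logs, but the principle is the same: every stage emits enough telemetry to spot where performance is slipping.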

Given that many enterprises use the cloud alongside existing on-premise systems to run dozens of applications, monitoring and troubleshooting all of them can be much more difficult for the DevOps team than other areas that fall under its remit.

For a Chief Data Officer – or whoever holds responsibility for data – with their eye on consolidating the gains made in 2018, the allure of fixing the avalanche of niggles that gives the data stack a precarious wobble will be very strong in 2019.

Kunal Agarwal, CEO and co-founder, Unravel Data - Kunal Agarwal co-founded Unravel Data in 2013 and serves as CEO. Mr. Agarwal has led sales and implementation of Oracle products at several Fortune 100 companies. He co-founded Yuuze.com, a pioneer in personalised shopping and what-to-wear recommendations. Before Yuuze.com, he helped Sun Microsystems run big data infrastructure such as Sun's Grid Computing Engine. Mr. Agarwal holds a bachelor's degree in Computer Engineering from Valparaiso University and an MBA from The Fuqua School of Business, Duke University.
