From the lab to production: Driving GenAI business success

Generative AI (GenAI) was the technology story of 2023. Spurred on by the breakaway success of ChatGPT, the likes of Amazon, Microsoft and Google accelerated their own efforts, creating a tidal wave of innovation that promises to reshape the way businesses and users harness technology to drive productivity. GenAI has already made significant strides in sectors such as pharma and law. But what we've seen so far is just the beginning. The true power of GenAI will only become clear once organisations take it out of the experimental stage and begin to use it more widely in production.

However, in order to ride the wave rather than get caught up in it, organisations must overcome some key challenges around cost and trust. Doing so will require a robust data roadmap that leverages the cloud.

Cost and trust are the biggest barriers

When it comes to GenAI, the old computing maxim of "garbage in, garbage out" applies: you can't expect to generate useful results if the model is trained on untrustworthy data. The challenge is that data governance and security are still at a nascent stage in many organisations, with crucial information often locked away in silos, making it effectively unusable without costly integration. In practice, this means that AI training data may be of poor quality and lack crucial business context, which can lead to hallucinations (fictional information that seems realistic) or factually accurate responses that lack the necessary business context. Either way, it adds no value for the business.
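One common way to supply that missing business context at query time is retrieval-augmented generation: fetching relevant, governed enterprise data and grounding the model's prompt in it. The sketch below is illustrative only; the search_governed_store function is a placeholder for whatever search or catalogue API the organisation's data platform actually exposes.

```python
# Illustrative sketch: grounding a GenAI prompt in governed enterprise data.
# `search_governed_store` is a placeholder, not a real library call.

def search_governed_store(query: str, top_k: int = 3) -> list[str]:
    """Placeholder: return the top_k most relevant, access-controlled snippets."""
    raise NotImplementedError("Wire this up to your data platform's search API.")

def build_grounded_prompt(question: str) -> str:
    """Prepend retrieved business context so the model answers from trusted data."""
    snippets = search_governed_store(question)
    context = "\n\n".join(snippets)
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
```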

Another critical pain point is the high cost of in-house GenAI projects. Outsourcing comes with its own security and compliance risks, but doing everything internally can be eye-wateringly expensive. A single cutting-edge GPU, designed specifically for running large language models (LLMs), costs around $30,000, and an organisation wanting to train a model with, say, 175 billion parameters might need 2,000 of them. That's a hefty bill in the region of tens of millions of dollars.
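A quick back-of-the-envelope calculation makes the scale of that bill concrete; the unit price and GPU count below are simply the illustrative figures above, not a quote for any specific hardware.

```python
# Back-of-the-envelope hardware cost, using the illustrative figures above.
gpu_unit_price = 30_000   # approximate USD per cutting-edge GPU
gpus_needed = 2_000       # e.g. to train a ~175-billion-parameter model

hardware_cost = gpu_unit_price * gpus_needed
print(f"GPU hardware alone: ${hardware_cost:,}")  # GPU hardware alone: $60,000,000
```

And that is before power, cooling, networking and the specialist staff needed to run such a cluster.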

Taking GenAI from the lab to production

This is why cloud infrastructure is becoming increasingly popular as a foundation for AI. Cloud providers have the GPU resources to let customers scale their GenAI projects while paying only for what they use. This enables organisations to experiment with GenAI and switch the model off once they've finished tinkering, rather than having to provision GPUs in on-premises environments. That saves on CapEx and provides the flexibility organisations need to take operations back in-house in the future if required.
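To make that CapEx-versus-OpEx trade-off tangible, here is a simple rent-versus-buy comparison. The hourly rate, GPU count and experiment length are placeholder assumptions for illustration, not quoted cloud prices.

```python
# Illustrative rent-vs-buy comparison for a short GenAI experiment.
# All figures are placeholder assumptions, not quoted prices.
hourly_rate_per_gpu = 5.0   # assumed on-demand USD/hour for one high-end GPU
gpus = 8                    # a small fine-tuning experiment
hours = 200                 # roughly a week of continuous runtime

cloud_cost = hourly_rate_per_gpu * gpus * hours
purchase_cost = 30_000 * gpus   # buying the same GPUs outright, at the figure above

print(f"Cloud experiment: ${cloud_cost:,.0f}")     # Cloud experiment: $8,000
print(f"Owning the GPUs:  ${purchase_cost:,.0f}")  # Owning the GPUs:  $240,000
```

The economics flip, of course, if utilisation stays high for years, which is exactly why the flexibility to move workloads back in-house matters.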

Once they've decided to adopt the cloud, how can organisations get GenAI projects out of the lab and delivering value in production environments? The BRIESO model (Build, Refine, Identify, Experiment, Scale and Optimise) is instructive here:

Build: First, create a modern data architecture and universal enterprise data mesh. Whether on-premises or in the cloud, this will give the organisation visibility and control of its data. It will also help establish a unified ontology for mapping, securing and achieving compliance across all data silos. Look for tools that not only meet current demand but have the scalability to accommodate future growth; open source solutions often offer the greatest flexibility.

Refine: Next, it's time to refine and optimise data according to existing business requirements. It's important at this stage to anticipate future requirements as accurately as possible. This will reduce the chances of migrating unnecessary data, which adds no value but may increase the cost of the project significantly.

Identify: Spot opportunities to use the cloud for specific workloads. A workload analysis will be useful here in determining where the most value could be derived. It's about connecting data across locations, whether on-premises or in multiple clouds, to optimise the project. Now is also a good time to consider potential use cases for development.

Experiment: Try pre-built, third-party GenAI platforms to find the one that best aligns with business requirements. There are plenty to choose from, including AWS's Bedrock, Azure's OpenAI Service (home of the GPT models behind ChatGPT) and Google's Vertex AI. It's important not to rush into a decision too early: the model must integrate closely with existing enterprise data for the project to stand any chance of success.
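As a flavour of what this experimentation looks like in practice, the sketch below calls a hosted foundation model on Amazon Bedrock via the boto3 SDK; the equivalents on Azure OpenAI or Vertex AI follow the same pattern of a short, authenticated API call. The model ID and request payload shape vary by provider and version, so treat both as assumptions to check against current documentation.

```python
# Minimal sketch: invoking a hosted foundation model on Amazon Bedrock with boto3.
# The model ID and payload format below are assumptions; check the current docs.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 300,
    "messages": [
        {"role": "user", "content": "Summarise the key terms of our standard supplier contract."}
    ],
})

response = client.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model identifier
    body=body,
)
print(json.loads(response["body"].read()))
```

A few lines like these are enough to compare providers on quality, latency and cost before committing to one.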

Scale and Optimise: Once a suitable platform is chosen, consider picking one or two use cases to scale into a production model. Continuously optimise the process, but keep an eye on GPU-related costs in case they start to spiral. As the organisation's GenAI capabilities start to grow, look for ways to optimise their use. A flexible AI platform is crucial to long-term success.
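One lightweight way to stop GPU-related spend from spiralling is a budget guardrail that compares actual daily spend against an agreed threshold. The budget figure and the get_daily_gpu_spend function below are hypothetical placeholders; in practice the number would come from the cloud provider's billing or cost-management API.

```python
# Hypothetical guardrail: flag days on which GPU spend exceeds an agreed budget.
# `get_daily_gpu_spend` is a placeholder for a call to the provider's billing API.

DAILY_GPU_BUDGET_USD = 1_500.0   # assumed daily budget agreed with finance

def get_daily_gpu_spend(date: str) -> float:
    """Placeholder: fetch the day's GPU spend from the cloud billing API."""
    raise NotImplementedError

def check_gpu_budget(date: str) -> None:
    """Print a warning when the day's spend breaches the agreed budget."""
    spend = get_daily_gpu_spend(date)
    if spend > DAILY_GPU_BUDGET_USD:
        overrun = spend - DAILY_GPU_BUDGET_USD
        print(f"{date}: GPU spend ${spend:,.0f} is ${overrun:,.0f} over budget")
    else:
        print(f"{date}: GPU spend ${spend:,.0f} is within budget")
```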

The future is here

IT and business leaders are understandably excited about the transformative potential of GenAI applications, from enhanced customer service to seamless supply chain management and supercharged DevOps. It's no surprise that 98% of global executives agree AI foundation models will play an important role in their strategy over the next three to five years.

But before anyone gets too carried away, there's still plenty of work to be done. A modern data architecture must be the starting point for any successful AI project. Then it's time to refine, identify, experiment, scale and optimise. The future awaits.

Paul Mackay

Paul Mackay is Regional Vice President Cloud - EMEA & APAC at Cloudera.
