Machine learning: A rationalist’s view

Given the volume of column inches devoted to artificial intelligence in today’s media, it is all too tempting to treat it as a revolutionary concept. In reality, the concept is anything but new. The term was coined by the computer scientist and pioneer John McCarthy, who proposed it for the Dartmouth summer research workshop held in 1956. This fact isn’t meant to dissuade those who have faith in AI’s undeniable potential to transform our world – because it surely will – but to encourage a degree of measure, reason and patience in the discussion.

First and foremost, all of us who work in the technology world have a responsibility to our customers and the wider public to be clear about what we mean when we say ‘AI’. As a recent editorial from The Guardian points out, the term is commonly used to describe what might more accurately be called machine learning: the process by which a system discovers patterns in data and uses them to improve its own performance, rather than following rules programmed in advance.
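To make that definition concrete, here is a minimal, hypothetical sketch (not drawn from the article, and using invented numbers): rather than hard-coding a rule, the system estimates one from example data and then applies it to new cases.

```python
# A minimal, illustrative sketch: "discovering patterns in data" often just
# means fitting a simple statistical model to examples. The dataset below is
# invented purely for illustration.
import numpy as np

# Toy dataset: advertising spend (in £k) vs. monthly sales (in £k).
spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sales = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# "Learning" here is estimating the slope and intercept from the data,
# rather than writing the relationship into the program in advance.
slope, intercept = np.polyfit(spend, sales, deg=1)

# The learned pattern can then be used to make predictions on new inputs.
predicted = slope * 6.0 + intercept
print(f"Learned rule: sales ≈ {slope:.2f} * spend + {intercept:.2f}")
print(f"Predicted sales at £6k spend: £{predicted:.1f}k")
```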

Today, machine learning touches the minutiae of our daily lives in more ways than we may care to imagine. Big brands like Apple and Nestlé are investing huge sums in developing AI and machine learning models to capitalise on the kind of in-depth data that builds a sophisticated picture of consumer choices and habits, informing their product development and advertising strategies. The ads that pop up in your web browser are anything but random. Hugely popular online streaming services like Netflix and Amazon Prime use similar systems to curate a viewing experience that you’ll enjoy, recommending films and TV series based on what you’ve previously watched. Ever wondered why a TV series you’ve never heard of has a score of “95%” on Netflix? That figure isn’t a user rating but the system playing matchmaker: the machine learning models learn your viewing patterns and then make a calculated judgement that you’ll enjoy it.
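For illustration only, here is a toy sketch of the general idea behind such a match score – build a taste profile from what a viewer has already watched and compare new titles against it. The titles, genres and numbers below are invented, and this is emphatically not Netflix’s actual algorithm.

```python
# Toy, hypothetical "match score": cosine similarity between a viewer's taste
# profile (built from watch history) and each candidate title, scaled to 0-100%.
import math

# Each title is described by scores against a few genre features (invented).
genres = ["crime", "comedy", "documentary", "sci-fi"]
catalogue = {
    "Nordic Noir Nights": [0.9, 0.0, 0.1, 0.0],
    "Stand-Up Special":   [0.0, 0.9, 0.0, 0.0],
    "Deep Space Files":   [0.2, 0.0, 0.3, 0.9],
}

# The viewer's taste profile is the average of the titles they have watched.
watch_history = ["Nordic Noir Nights", "Deep Space Files"]
profile = [sum(catalogue[t][i] for t in watch_history) / len(watch_history)
           for i in range(len(genres))]

def match_score(profile, title_vector):
    """Cosine similarity between taste profile and title, scaled to a percentage."""
    dot = sum(p * t for p, t in zip(profile, title_vector))
    norm = (math.sqrt(sum(p * p for p in profile))
            * math.sqrt(sum(t * t for t in title_vector)))
    return round(100 * dot / norm) if norm else 0

for title, vector in catalogue.items():
    print(f"{title}: {match_score(profile, vector)}% match")
```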

In truth, Netflix belongs to a powerful new generation of businesses whose online models are contingent on their ability to shape the services they provide around the behaviours and preferences of their customers. For modern behemoths such as Amazon, eBay, Uber and Deliveroo – with their unprecedented market valuations – this is the new norm. But the reality is that these companies, though ubiquitous, remain the exception and not the rule.

While firms such as PwC estimate that AI could add a staggering $15.7 trillion to the global economy by 2030, research from Microsoft last year found that most organisations do not have an adequate AI strategy in place to take advantage of this opportunity. The challenge lies in the data. It doesn’t really matter how smart or sophisticated the AI models are: if they are fed poor-quality data, or simply not enough of it, they cannot produce any kind of meaningful insight. This is, in part, what Mars Inc’s chief digital officer, Sandeep Dadlani, means when he says that “AI is still in its infancy in terms of solving problems”. Organisations haven’t even scratched the surface of what’s possible because they still do not possess the data that will make a real difference. For machine learning to do its job, it needs datasets that are broad, consistent and unbiased. The phrase ‘garbage in, garbage out’ was first used in 1957, just one year after McCarthy coined ‘AI’, but it is more relevant now than ever before. In fact, IBM has estimated that ‘bad data’ could be costing organisations as much as $3.1 trillion a year in the US alone.
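As a purely illustrative sketch of what ‘broad, consistent and unbiased’ means in practice, the kind of sanity checks an analyst might run before training anything could look like the following. The dataset, column names and figures are invented for the example.

```python
# Hypothetical pre-training data-quality check illustrating 'garbage in,
# garbage out': before any model sees the data, measure how complete,
# consistent and balanced it actually is.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 102, 102, 104, 105, 106],
    "region":      ["UK", "UK", "UK", None, "FR", "UK"],
    "churned":     [0, 0, 0, 0, 0, 1],   # the label a model would try to predict
})

# 1. Completeness: what share of each column is missing?
missing_share = df.isna().mean()

# 2. Consistency: duplicated records silently over-weight some customers.
duplicate_share = df.duplicated(subset="customer_id").mean()

# 3. Balance / bias: a heavily skewed label or region mix limits what any
#    model can learn about the minority cases.
label_balance = df["churned"].value_counts(normalize=True)
region_balance = df["region"].value_counts(normalize=True)

print("Missing values per column:\n", missing_share, sep="")
print(f"\nDuplicate customer rows: {duplicate_share:.0%}")
print("\nLabel balance:\n", label_balance, sep="")
print("\nRegion mix:\n", region_balance, sep="")
```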

One solution is greater investment in closing the digital talent gap. As organisations move more administrative and sales processes online, they have no choice but to recruit more computer programmers, web developers and data analytics experts. This is already happening: Harvard Business Review described ‘data scientist’ as the sexiest job of the 21st century seven years ago, while online recruitment firm Glassdoor has named it the “best job” in America for the past four consecutive years.

All of this suggests that demand for intelligent tools that improve data analytics and governance will only increase, as organisations streamline their corporate information architecture with tools that embed machine learning into business operations and, ultimately, deliver a simpler, richer customer experience. This is why Mitra has recently partnered with Amazon Web Services Marketplace for Machine Learning. Our business has developed a suite of machine learning products to help democratise AI, giving a growing number of organisations the ability to turn their data into genuine insight. First, however, businesses must lay the necessary foundations by looking at the type and volume of data they are collecting, and at the objectives they are trying to achieve as a business.
