‘Tiny AI’: the next big thing in intellectual property?

Advances in AI have changed the world in the last decade, whether in direct consumer-facing services like Amazon’s ‘Alexa’ or Apple’s ‘Siri’, which have become widespread, or behind the scenes in image processing, less glamorous inference, data processing, or control applications.

Whilst large-scale, necessarily cloud-anchored applications will be here for the foreseeable future, cloud processing of AI requests places heavy demands on communications networks. Regardless of how communication technology advances, there will always be some issues of latency and reliability of connection. In a mobile world, and for real-time safety-critical applications – autonomous vehicles being one notable example – this can still be an important practical barrier to greater adoption.

The world is, unfortunately, experiencing the effects of the Covid-19 crisis at the time of writing, which has, among other things, greatly reduced human travel and transport energy demands. Data still travels, increasingly so, and a rising concern is the proportion of energy consumed by computing and communications. Although data obviously costs far less energy to transport than a person, it still has a finite energy and infrastructure cost. Rapidly growing streams of raw data flowing between myriad mobile consumer AI applications and data centres will hit bottlenecks. ‘Locally sourced’ intelligence can mitigate this.

The historical shifts between cloud, local or edge, and mixed processing in other areas of computing will be familiar to most readers. You wouldn’t sensibly post a letter from the UK to California just to ask a question that the person at the post office counter could easily have answered for you. Using a data centre a thousand miles away to identify a smile could be compared to using a sledgehammer to crack a nut – even if you still need a cloud solution for part of the application. For many AI applications, an acceptable result can be obtained without the resources of a data centre, if a suitably trained processor is used locally. This can avoid, or at least reduce, communication and data-centre processing demands – thereby making more applications more widely available.
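
To make that trade-off concrete, below is a minimal Python sketch of one possible pattern behind such applications: run a small on-device model first, and only fall back to the data centre when the local answer is not confident enough. The model, threshold and cloud fallback here are illustrative placeholders, not any particular product’s implementation.

```python
# Minimal sketch of a 'local first, cloud as fallback' inference pattern.
# The model, threshold and cloud fallback are illustrative placeholders.
import numpy as np

CONFIDENCE_THRESHOLD = 0.8  # below this, defer to the cloud

# A tiny stand-in for an on-device model: one linear layer plus softmax.
rng = np.random.default_rng(0)
weights = rng.normal(size=(8, 3))   # 8 input features, 3 classes
bias = np.zeros(3)

def local_predict(features: np.ndarray) -> tuple[int, float]:
    """Run the small on-device model and return (class, confidence)."""
    logits = features @ weights + bias
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(probs.argmax()), float(probs.max())

def cloud_predict(features: np.ndarray) -> int:
    """Placeholder for a round trip to a data-centre model."""
    # In a real system this would be a network call; here it simply marks
    # where the expensive, high-latency path would sit.
    raise NotImplementedError("cloud fallback not wired up in this sketch")

def classify(features: np.ndarray) -> int:
    label, confidence = local_predict(features)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label                 # answered locally: no data leaves the device
    return cloud_predict(features)   # rare, hard cases still go to the cloud

if __name__ == "__main__":
    sample = rng.normal(size=8)
    try:
        print("prediction:", classify(sample))
    except NotImplementedError:
        print("local model not confident enough; would defer to cloud")
```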

A lot has been said and written about patent protection for AI. Speaking from my personal perspective as a patent attorney working in this area, I have frankly seen disproportionate hype around little more than the application of known techniques to a new data set, whereas some of the more interesting development (to me personally at least) is not really protectable, for various reasons outside the scope of this article. That is not to say there is no scope for valuable innovation in ‘conventional’ AI, but do not expect every application to be protectable, and be prepared to engage creatively with an adviser to root out what is commercially worth protecting.

That being said, I do see plenty of scope for interesting and protectable development in making tiny AI work well. There are of course numerous applications which do work locally, but AI feeds on data. Plenty of training data matters, so gathering available new input to grow that dataset is a key part of bootstrapping knowledge and effectiveness. This does not readily lend itself to individual smaller processors working on their own datasets.

Choosing the right aspects of what can be done in the cloud to do locally instead, and then managing the interactions between local and cloud components, still presents plenty of unsolved problems. Getting the most out of all the data available to the local device’s eyes or ears without simply shipping all of it back to base, and keeping distributed processors efficiently up to date with the latest intelligence, is likely to play an important part in next-generation consumer AI applications.
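
One family of approaches to those problems keeps raw data on the device and ships only compact model updates back to base, which the cloud side aggregates into refreshed ‘intelligence’ for every device. The toy Python sketch below illustrates that idea with made-up data and a simple averaged update; it is not tied to any real framework or to any specific product mentioned here.

```python
# Toy sketch of the 'send model updates, not raw data' idea
# (federated-style averaging). Data, model and learning rate are all
# made up for illustration.
import numpy as np

rng = np.random.default_rng(1)
NUM_DEVICES, FEATURES, LR = 5, 4, 0.1

# Shared ('cloud') model: a simple linear regressor.
global_weights = np.zeros(FEATURES)

def local_update(weights, x, y):
    """One gradient step on a device's private data; only the resulting
    weight delta ever leaves the device, never x or y themselves."""
    grad = 2 * x.T @ (x @ weights - y) / len(y)
    return -LR * grad

for _round in range(10):
    deltas = []
    for _device in range(NUM_DEVICES):
        # Each device holds its own small private dataset.
        x = rng.normal(size=(20, FEATURES))
        y = x @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=20)
        deltas.append(local_update(global_weights, x, y))
    # The server only ever sees the compact updates, which it averages in.
    global_weights += np.mean(deltas, axis=0)

print("learned weights:", np.round(global_weights, 2))
```

In this toy version each round costs a handful of numbers per device rather than the device’s whole dataset, which is the essence of the bandwidth saving described above.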

I see AI as simply following the cycle of other industrial developments, a step or two behind: we are now looking at improving energy efficiency, miniaturisation, data supply chain logistics, and security (which will be an issue – rogue data corrupting training, for example). There is plenty of scope for innovation here, as in other industries.

The backbone technology of AI is potentially valuable, and I predict numerous new companies will emerge to capitalise on niche areas, with intellectual property (IP) as a key part of their strategy to maximise value and avoid simply being swallowed up or bypassed (and they may well be acquired on the strength of that value). Companies working in AI often have a lot of smart people focused, understandably, on getting the product working well, but some give a lower priority to looking ahead and actively maximising the value they will ultimately realise. A creative approach to IP strategy is one component. It is always satisfying to see a company that has developed a great product do well, and frustrating to see one that did OK but could have done better. The key is for more companies to capitalise fully on the advances they bring in giving AI more widespread and sustainable application.

Ilya has been consistently identified as a world-leading IP strategy expert by Intellectual Asset Management (IAM) for several years. Ilya advises companies ranging from start-ups to multinationals, across a broad technology spectrum including IT and medical technology, on IP strategy and on obtaining, defending, opposing and litigating patents.
