How AI can improve battery cell R&D productivity

Everywhere you turn these days, there is news about the power of artificial intelligence (AI) to transform our lives – from diagnosing cancers to writing movie scripts. Now AI practitioners are turning their attention to electric vehicle (EV) batteries.  

Data-driven, machine-learning-based approaches have received much attention from both academia and industry over the past decade. Research into the use of AI has shown promise for accurately predicting the dynamics of nonlinear, multiscale, and multiphysics electrochemical systems: everything from battery state of health (SOH) estimation, safety and risk prediction, cycle life prediction and battery lifetime prognostics, to closed-loop optimisation of fast-charging protocols and identifying degradation patterns of lithium-ion batteries from impedance spectroscopy. The predictive ability of AI is challenged, however, in EV lithium-ion battery research and development, where extensive datasets do not yet exist. This is particularly true when predicting the cycle life of the battery pack and the cells that make it up. Small changes to the chemical or physical properties of the anode, cathode, electrolyte, and even the separator can have a significant impact on the battery’s performance.

The enormity of the task is better understood when the time required to predict a cell’s life through experimentation alone is considered. 

Due to the long service life of batteries, simulating the cycle life of a cell during operation in an EV requires a test regime of at least 500 cycles. The extreme fast-charge, discharge, and rest sequence normally limits testing to about 17 cycles per day. This means that every iteration of even the smallest change would take more than a full month of uninterrupted testing to predict the cycle life.
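The arithmetic behind that bottleneck is simple to sketch, using only the figures quoted above:

```python
# Rough test-duration estimate from the figures quoted above.
CYCLES_REQUIRED = 500   # minimum cycles to simulate EV service life
CYCLES_PER_DAY = 17     # fast charge + discharge + rest limits throughput

days = CYCLES_REQUIRED / CYCLES_PER_DAY
print(f"{days:.1f} days of uninterrupted testing per iteration")
# → roughly 29.4 days, i.e. more than a full month
```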

However, in practice it is not uncommon to conduct up to 200 experiments in parallel, often in groups of 8 or more cells.

Running these test procedures while recording measurements every second from each of the thousands of independent battery test channels creates an enormous amount of data. Even with a programmed, automated, cloud-based solution allowing all these experiments to run every week, the batteries would still need many months of testing to simulate their cycle life.
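To get a feel for the data volume, a back-of-the-envelope estimate helps. The signals-per-channel count below is an illustrative assumption, not a figure from StoreDot; the channel count follows from the 200 parallel experiments of 8 cells mentioned above:

```python
# Illustrative data-rate estimate; signals_per_channel is an assumption.
channels = 200 * 8           # 200 parallel experiments of 8 cells each
signals_per_channel = 10     # e.g. voltage, current, temperature... (assumed)

samples_per_second = channels * signals_per_channel
samples_per_week = samples_per_second * 60 * 60 * 24 * 7
print(f"{samples_per_week:,} measurements per week")
# → 9,676,800,000 — nearly ten billion data points per week
```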

This is a time-consuming and expensive bottleneck in battery research.

When conducting their cutting-edge research into Extreme Fast Charging (XFC) battery technologies, researchers at StoreDot found they needed a better way to predict each battery’s end of life without losing valuable data insights and transparency.

By choosing to use artificial intelligence to do the ‘heavy lifting’, StoreDot became one of the first battery-tech companies to implement AI in the R&D phase of EV battery development.

Deploying AI in battery R&D saves time and money 

Analysing the results obtained from the complex set of experiments carried out on diverse battery-cell chemistries and designs is very similar to analysing clinical trials conducted on sample sets of patients. This led the research team to investigate merging the world of battery chemistry with the concepts and methodologies applied to patient clinical trials.

Using Evolution Intelligence’s AI-for-AI platform, StoreDot built and optimised its models, which indicated that Kaplan-Meier (KM) graphs and methodologies would be ideally suited to investigating, learning, and predicting battery lifetime. This would speed up the battery R&D cycle, in particular by predicting the battery lifetime and, equally important, by learning from the explainable algorithm which markers and metrics were significant: a valuable lesson in understanding the company’s unique chemistry and its boundaries.

The genetic KM algorithm takes into consideration the hundreds of “genes” originating from chemistry formulations, cell design decisions, and production process measurements. Coupled with aggregated and augmented test measurements, a forest of decision trees is created dynamically to explain the results.

This ad-hoc clustering method calculates the survivability of each battery in the resulting tree leaf and uses that information to maximise the pre-determined parameters. In essence, as the machine learns and improves the tree leaves, making the batteries in each leaf more similar to each other, the lifetime predictions become better.
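For readers unfamiliar with survival analysis, the underlying Kaplan-Meier estimator can be sketched as follows. This is the standard textbook form, not StoreDot’s genetic variant: a cell reaching end-of-life counts as an “event”, and cells still cycling are censored observations.

```python
def kaplan_meier(durations, event_observed):
    """Standard Kaplan-Meier survival estimate.

    durations:      cycle count at which each cell failed or was censored
    event_observed: 1 if the cell reached end-of-life, 0 if still cycling
    """
    event_times = sorted({d for d, e in zip(durations, event_observed) if e})
    survival, s = [], 1.0
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, e in zip(durations, event_observed)
                     if d == t and e)
        s *= 1 - deaths / at_risk          # multiply in this step's survival
        survival.append((t, s))
    return survival

# Toy example: five cells, two of them censored (still on test)
print(kaplan_meier([300, 450, 450, 500, 520], [1, 1, 0, 1, 0]))
# survival drops to 0.8, 0.6, and 0.3 at cycles 300, 450, and 500
```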

The resulting nodes in the trees are the algorithm’s insights about how to improve battery life and can be used to establish cell design parameters, manufacturing parameters, or specific timed measurements.

Thus, the cycle life can be predicted to within 15 percent after only 125 completed cycles, instead of 500–1,500, thereby freeing up resources and slashing the time needed for evaluation and decision-making. What is more, the algorithm allows researchers to cut short any experiment that does not meet the set objectives, whilst adding the successful candidates’ results to the database for inclusion in future experiments.
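That early-stopping decision might look something like the following sketch. The target and the decision rule are hypothetical; only the 125-cycle prediction point and the ~15 percent accuracy come from the text above.

```python
# Hypothetical early-stopping check; names and thresholds are illustrative.
PREDICTION_CYCLE = 125      # cycles completed before the model predicts
TARGET_CYCLE_LIFE = 1000    # assumed project objective (illustrative)
TOLERANCE = 0.15            # the ~15% prediction accuracy quoted above

def should_stop_early(predicted_cycle_life: float) -> bool:
    """Abandon the candidate if even an optimistic prediction misses target."""
    optimistic = predicted_cycle_life * (1 + TOLERANCE)
    return optimistic < TARGET_CYCLE_LIFE

print(should_stop_early(700))   # True: 805 < 1000, cut the experiment short
print(should_stop_early(900))   # False: 1035 >= 1000, keep testing
```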

Moving to the next generation of predictors

As the project moves from R&D to real-life pilot lines, the transparency of the models becomes less important, while accuracy and early prediction points become crucial for the manufacturing and operability phases of the battery.

At this stage the KM graphs were replaced by tighter clustering algorithms that learn the highly-augmented time-series measurements of the battery’s life as it progresses.

With thousands of calculated trajectories of key performance indicators over the battery’s life so far, the algorithm optimises for statistical significance and chooses the X most important vectors that hold most of the signal needed to accurately predict the remaining useful life of the pack.

Here, Evolution’s AI-for-AI platform was used to optimise and select thousands of “genetic algorithm” generations from millions of potential combinations.

The results allow researchers to accurately predict multiple future targets at earlier prediction points (for instance, cycles 32, 62, and 122, equal to roughly 2, 4, and 8 testing days, respectively) with appropriately high accuracies.

In the Predicted vs Actual “Cycle @ retention 85%” graph below, the training set is shown in blue, while the newer, never-seen-by-the-algorithm, test set is in orange, demonstrating a high prediction accuracy.

Clearly, AI and machine learning are proving invaluable during the R&D process, enabling researchers to evaluate the impact of changing more than one variable at a time and thereby speeding up the accurate determination of a battery’s cycle life.

However, AI and machine learning’s ability to forecast battery life has other benefits for high-tech research companies such as StoreDot: it provides management, and investors, with a valuable tool to predict the viability of untried, nascent technologies.

AI used in EV battery R&D delivers other important hidden benefits

In the rapidly developing EV battery sector, with new breakthroughs being announced and delivered almost daily, it may be difficult to identify which technologies to pursue or invest time and money in. Even more so if you have a small team of researchers competing with large well-funded corporations.

Prediction and forecasting of key battery performance metrics in the early stages of a project can give decision makers a more tangible indication of the likelihood of success of a given idea, technology, or even business model.


In a landscape filled with exciting technologies and noble ideas, not all will eventually converge into a successful product or technology. Predictive results obtained through AI improve confidence in the chances of a successful convergence.

At the same time, the deployment of AI and machine learning can often supplement hard-to-find skills and talent in battery R&D. This is particularly helpful in smaller organisations that are competing in the same space as the highly funded and well-established industry stalwarts. Effective implementation of AI gives these smaller operations a better ‘David and Goliath’ shot at levelling the playing field.


With the rapid advances being made in AI, the technology, when applied to the EV battery industry, is set to assume a pivotal role in reducing costs and improving performance. StoreDot’s R&D success in developing a cutting-edge extreme fast charging battery technology is proof of the effectiveness of AI in saving time and money previously spent on processes such as repetitive life cycle testing.

However, even though AI is ideally positioned to revolutionise EV battery R&D, it has many other, equally profound applications in the all-electric vehicle industry.

Everything from managing sensitive global supply chains, to aggregating and evaluating data gathered from fleets through cloud computing to improve battery management. By creating smarter batteries with embedded sensing capabilities, and self-healing functionality, connected EVs’ battery management systems can continuously monitor the ‘state of health’, and even rejuvenate selected battery cells or modules if required. 

What is more, when enabled through the IoT and over-the-air updates, EV performance and safety can be enhanced to operate closer to the extremities of the envelope. Whether this comes in the form of optimising the range or reducing the time to charge, one thing is for sure: AI will play a key role in reducing “range anxiety” and speeding up the adoption of BEVs.


Adoram Rogel / Director of Data Science at StoreDot

Adoram is an experienced technical leader, software and cloud architect, evangelist and programmer at StoreDot. He uses data science to process, analyse and predict the performance and life cycle of ultra-fast charging batteries.
