Using Deep Learning to Assess Lithium Metal Battery Performance
Perlmutter GPUs Aid Development of Unique Computer Vision Algorithm
July 14, 2023
By Keri Troutman
Contact: cscomms@lbl.gov
In the ongoing quest to develop new battery designs, scientists rely on highly accurate assessment tools so they can understand defects and track performance. Solid-state lithium metal batteries have the potential to revolutionize the energy storage industry by providing safer, longer-lasting, and higher-performance energy storage solutions than traditional lithium batteries. However, developing stable and efficient solid-state electrolytes that can withstand the demanding operating conditions of batteries remains a significant technical hurdle.
This challenge prompted a team of researchers from Lawrence Berkeley National Laboratory’s Center for Advanced Mathematics for Energy Research Applications (CAMERA) to work with colleagues to develop batteryNET, a deep learning algorithm that enables unprecedented assessment of lithium agglomeration in solid-state lithium metal batteries. An overview of batteryNET, which uses a neural network to track morphologies that appear in batteries over time, was recently featured in a paper published in npj Computational Materials, a Nature Partner Journal.
“The ability of computer vision algorithms like batteryNET to seamlessly transition across scientific domains holds great potential for moving a wide range of research forward,” noted co-author Daniela Ushizima, a staff data scientist who leads CAMERA’s Computer Vision team. Other Berkeley Lab researchers and affiliates who are co-authors on the paper include Jerome Quenum, Dula Parkinson, and former employee David Perlmutter, now at Apple. Additional co-authors include Iryna Zenyuk, Ying Huang, and Andrea Fei-Huei Su of the University of California, Irvine; and Pavel Shevchenko of Argonne National Laboratory’s Advanced Photon Source.
Quality Control is Key
Unlike traditional lithium-ion batteries that contain a flammable electrolyte, solid-state lithium metal batteries use solid electrolytes to provide a more stable interface with lithium metal, making them less likely to catch fire or explode. They can also store more energy per unit weight or volume than conventional lithium-ion batteries, which could lead to lighter, longer-lasting, and more powerful devices.
“These batteries promise superior performance, but quality control is key to deploying safe designs,” Ushizima said.
One critical element in assessing lithium battery performance and safety, especially across multiple charge and discharge cycles, is systematically measuring the growth of dendritic structures, which can severely degrade the battery’s material properties. During cycling, lithium ions travel back and forth between the electrodes, but electrochemical instabilities cause some of the lithium to become less mobile and deposit at the surface of one of the electrodes. As these deposits grow, the resulting dendritic structure can penetrate the electrolyte, eventually short-circuiting the battery.
To address this issue, batteryNET leverages residual U-Nets, a hierarchical deep-learning architecture for semantic segmentation – a computer vision task that assigns each pixel in an image to a class or object. This allows users to turn high-resolution X-ray micro-computed tomography (micro-CT) data into quantitative information about the electrodes, electrolyte, and lithium distribution, which can then be used to inspect batteries for durability and safety. batteryNET’s semantic segmentation approach yields an array of measurements of how the battery’s components change over time, providing insights that can significantly benefit future battery design, as well as a computational model that generalizes to unseen datasets, noted Ushizima.
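The paper’s code isn’t reproduced here, but the core idea of a residual U-Net for semantic segmentation can be sketched briefly. In the minimal PyTorch example below, the channel widths, the single encoder/decoder level, and the four output classes (e.g., background, electrode, electrolyte, lithium) are illustrative assumptions, not batteryNET’s actual architecture:

```python
# A minimal residual U-Net sketch for per-pixel classification, in the spirit
# of batteryNET as described in the article; all sizes here are assumptions.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a skip (residual) connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 convolution so the skip path matches the output channel count.
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.conv(x) + self.skip(x))

class TinyResUNet(nn.Module):
    """One-level encoder/decoder U-Net built from residual blocks."""
    def __init__(self, in_ch=1, n_classes=4):
        super().__init__()
        self.enc1 = ResidualBlock(in_ch, 32)
        self.down = nn.MaxPool2d(2)
        self.enc2 = ResidualBlock(32, 64)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ResidualBlock(64, 32)        # 64 = 32 upsampled + 32 skip
        self.head = nn.Conv2d(32, n_classes, 1)  # per-pixel class scores

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.down(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1)                     # (batch, n_classes, H, W)

# Each pixel of a grayscale micro-CT slice gets one score per class;
# the argmax over classes gives the predicted segmentation map.
model = TinyResUNet()
scores = model(torch.randn(1, 1, 64, 64))
labels = scores.argmax(dim=1)
```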
batteryNET also addresses the vanishing gradient problem often found in deep learning: as more layers are added to a neural network, the gradient signals used to update its weights can shrink toward zero as they propagate backward through the layers, and training in the earlier layers stalls. By introducing residual connections – which skip some layers, letting information and gradients flow directly from one layer to a later one – batteryNET alleviates this degradation across deeper layers.
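A small, self-contained experiment with toy dimensions (not the batteryNET network) illustrates the effect: because a residual layer computes y = f(x) + x, its gradient includes an identity term, and the input gradient of the residual stack typically stays orders of magnitude larger than that of the plain stack:

```python
# Toy demonstration of vanishing gradients; depth and width are arbitrary.
import torch
import torch.nn as nn

depth, width = 50, 16
x = torch.randn(1, width)

def input_grad_norm(layers, residual):
    h = x.clone().requires_grad_(True)
    out = h
    for layer in layers:
        # With the residual path, each step is out = layer(out) + out.
        out = layer(out) + out if residual else layer(out)
    out.sum().backward()
    return h.grad.norm().item()

torch.manual_seed(0)
layers = [nn.Sequential(nn.Linear(width, width), nn.Tanh()) for _ in range(depth)]
print("plain stack gradient norm:   ", input_grad_norm(layers, residual=False))
print("residual stack gradient norm:", input_grad_norm(layers, residual=True))
```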
“We have not only a snapshot but the whole evolution of the dendrites during a full cycle, so we can see that, at a certain amount of time and charge, problems start,” said Ushizima. “Detecting and tracking dendrite formations can inform materials scientists about optimal electrode-electrolyte compositions and interactions. What we’re doing now is capturing lithium agglomeration and respective morphologies to track the dendrite growth.”
Leveraging Iterative Training Protocols
For the paper’s study, the researchers used 3D datasets gathered from micro-CT experiments conducted at Berkeley Lab’s Advanced Light Source and Argonne National Laboratory’s Advanced Photon Source. These datasets were then used on the Perlmutter supercomputer at the National Energy Research Scientific Computing Center (NERSC) to train artificial intelligence models and identify characteristics vital to improving battery performance. At around 5 gigabytes of data per frame, training the models required considerable computational power and data storage.
“NERSC’s Perlmutter supercomputer was able to handle these large datasets, and the GPUs [Graphics Processing Units] were ideal for this algorithm,” said Ushizima.
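The article doesn’t detail the team’s training setup, but a common pattern for spreading training across a node’s GPUs is PyTorch’s DistributedDataParallel. The sketch below is a generic example of that pattern, not the project’s code; the random stand-in data, one-layer stand-in model, and the hypothetical script name train.py (launched with, e.g., torchrun --nproc_per_node=4 train.py) are all placeholders:

```python
# Generic multi-GPU data-parallel training sketch (PyTorch DDP). Everything
# here -- stand-in data, stand-in model, hyperparameters -- is a placeholder.
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

dist.init_process_group("nccl")           # torchrun starts one process per GPU
rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(rank)

# Random tensors standing in for micro-CT slices and per-pixel class labels.
data = TensorDataset(torch.randn(256, 1, 64, 64),
                     torch.randint(0, 4, (256, 64, 64)))
sampler = DistributedSampler(data)        # shards the dataset across GPUs
loader = DataLoader(data, batch_size=8, sampler=sampler)

# A single convolution as a stand-in for a real segmentation network.
model = DDP(nn.Conv2d(1, 4, 3, padding=1).cuda(rank), device_ids=[rank])
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()           # per-pixel classification loss

for epoch in range(2):
    sampler.set_epoch(epoch)              # reshuffle the shards each epoch
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images.cuda(rank)), labels.cuda(rank))
        loss.backward()                   # DDP averages gradients across GPUs
        opt.step()

dist.destroy_process_group()
```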
Ushizima initially embarked on this research in March 2020 with Zenyuk, director of the National Fuel Cell Research Center at the University of California, Irvine. Zenyuk led the effort to prepare the samples and compile the micro-CT datasets for the project. After extensive analysis and collaboration, the batteryNET algorithm was designed, trained, validated, tested, and iteratively fine-tuned.
“The incorporation of lithium metal anodes in new designs holds promise for advancing battery technologies,” said Zenyuk. “Studies like the one we reported in npj Computational Materials combine the state-of-the-art material characterization and computation that are much needed for facilitating the successful implementation of emerging technologies.”
“We envision these deep learning algorithms being further adapted to support materials science investigations, leveraging iterative training protocols and the utilization of multiple GPUs at NERSC,” added Ushizima. “This will enable us to explore diverse areas, from lithium protrusions in batteries to analyzing imperfections in materials for electrolyzers.”
This work received support from the National Science Foundation under CBET Award 1605159 and from projects at Berkeley Lab funded by the U.S. Department of Energy (DOE) Office of Science’s Advanced Scientific Computing Research (ASCR) and Basic Energy Sciences (BES) programs under Contract No. DE-AC02-05CH11231. This research used resources of the Advanced Photon Source (APS), a DOE Office of Science user facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357. The Advanced Light Source is supported by the Director, Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. This research also used resources of NERSC, a DOE Office of Science user facility located at Lawrence Berkeley National Laboratory.
About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy Office of Science User Facility that serves as the primary high performance computing center for scientific research sponsored by the Office of Science. Located at Lawrence Berkeley National Laboratory, NERSC serves almost 10,000 scientists at national laboratories and universities researching a wide range of problems in climate, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a DOE national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. Department of Energy. Learn more about computing sciences at Berkeley Lab.