From EV batteries to solar cells to microchips, new materials can supercharge technological breakthroughs. But discovering them usually takes months or even years of trial-and-error research. 

Google DeepMind hopes to change that with a new tool that uses deep learning to dramatically speed up the process of discovering new materials. Called graph networks for materials exploration (GNoME), the technology has already been used to predict structures for 2.2 million new materials, more than 700 of which have since been created in the lab and are now being tested. It is described in a paper published in Nature today. 

Alongside GNoME, Lawrence Berkeley National Laboratory also announced a new autonomous lab. In partnership with DeepMind, the lab takes GNoME’s discoveries and uses machine learning and robotic arms to engineer new materials without the help of humans. Google DeepMind says that together, these advancements show the potential of using AI to scale up the discovery and development of new materials.

GNoME can be described as AlphaFold for materials discovery, according to Ju Li, a materials science and engineering professor at the Massachusetts Institute of Technology. AlphaFold, a DeepMind AI system announced in 2020, predicts the structures of proteins with high accuracy and has since advanced biological research and drug discovery. Thanks to GNoME, the number of known stable materials has grown almost tenfold, to 421,000.

“While materials play a very critical role in almost any technology, we as humanity know only a few tens of thousands of stable materials,” said Dogus Cubuk, materials discovery lead at Google DeepMind, at a press briefing. 


To discover new materials, scientists combine elements across the periodic table. But because there are so many possible combinations, it is inefficient to search blindly. Instead, researchers build on existing structures, making small tweaks in the hope of discovering new combinations that hold potential. This painstaking process is still very time-consuming, and because it builds on existing structures, it limits the chances of unexpected discoveries. 

To overcome these limitations, DeepMind combines two different deep-learning models. The first generates more than a billion structures by making modifications to elements in existing materials. The second, however, ignores existing structures and predicts the stability of new materials purely on the basis of chemical formulas. The combination of these two models allows for a much broader range of possibilities. 
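To make that two-track approach concrete, here is a minimal, purely illustrative Python sketch, not DeepMind's code: the structures, substitution rules, and element lists below are invented. It shows the two candidate-generation routes described above, tweaking known structures by swapping in similar elements, and proposing chemical formulas with no reference structure at all.

```python
import random

# Toy stand-ins for a real structure database and substitution rules.
KNOWN_STRUCTURES = [
    {"formula": {"Li": 2, "Mn": 1, "O": 3}},
    {"formula": {"Na": 1, "Fe": 1, "P": 1, "O": 4}},
]
SUBSTITUTIONS = {"Li": ["Na", "K"], "Mn": ["Fe", "Co"], "Fe": ["Mn", "Ni"]}

def substitution_candidates(structures):
    """Route 1: tweak known structures by swapping in chemically similar elements."""
    for s in structures:
        for elem, replacements in SUBSTITUTIONS.items():
            if elem in s["formula"]:
                for new_elem in replacements:
                    formula = dict(s["formula"])
                    formula[new_elem] = formula.pop(elem)
                    yield {"formula": formula, "route": "substitution"}

def composition_candidates(n=5, elements=("Li", "Na", "Mn", "Fe", "O", "S")):
    """Route 2: propose formulas directly, with no reference structure at all."""
    for _ in range(n):
        chosen = random.sample(elements, k=3)
        yield {"formula": {e: random.randint(1, 4) for e in chosen},
               "route": "composition"}

candidates = list(substitution_candidates(KNOWN_STRUCTURES)) + list(composition_candidates())
print(len(candidates), "candidate materials generated")
```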

Once the candidate structures are generated, they are filtered through DeepMind’s GNoME models. The models predict the decomposition energy of each structure, an important indicator of how stable a material is likely to be: “stable” materials do not easily decompose into other compounds, which matters for engineering purposes. GNoME selects the most promising candidates, which then go through further evaluation based on established theoretical frameworks.
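As a rough illustration of what decomposition energy means, with made-up numbers and under one common sign convention, a candidate is predicted to be stable when its energy per atom sits at or below that of the best balanced combination of known competing phases:

```python
# Toy illustration of decomposition energy; all numbers are invented.
candidate_energy_per_atom = -6.20       # hypothetical compound "ABO3", in eV per atom
competing_mix_energy_per_atom = -6.05   # best balanced mix of known phases, e.g. AO + BO2

# Negative value: no known combination of phases is lower in energy,
# so the candidate is predicted to be stable.
e_decomp = candidate_energy_per_atom - competing_mix_energy_per_atom
print(f"decomposition energy: {e_decomp:+.2f} eV/atom")
print("predicted stable" if e_decomp <= 0 else "predicted unstable")
```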

This process is then repeated multiple times, with each discovery incorporated into the next round of training.
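A compressed, self-contained sketch of that loop, essentially an active-learning cycle, might look like the following. Every helper here is a toy stand-in that uses random numbers in place of a real graph neural network and real first-principles calculations; it exists only to show how each round's verified results feed the next round's training.

```python
import random

def generate_candidates(n=1000):
    return [{"id": i} for i in range(n)]            # stand-in for both generation routes

def predict_decomposition_energy(model, cand):
    return random.gauss(model["bias"], 0.1)         # toy surrogate for the GNoME prediction

def verify_decomposition_energy(cand):
    return random.gauss(0.05, 0.1)                  # toy stand-in for an expensive theory check

def retrain(model, training_data):
    # Pretend retraining: nudge the surrogate toward the verified energies.
    energies = [e for _, e in training_data] or [0.0]
    return {"bias": sum(energies) / len(energies)}

model, training_data, stable = {"bias": 0.2}, [], []
for round_number in range(3):                       # GNoME ran many such rounds
    promising = [c for c in generate_candidates()
                 if predict_decomposition_energy(model, c) <= 0.0]
    for cand in promising:
        e = verify_decomposition_energy(cand)       # expensive verification step
        training_data.append((cand, e))             # every verified result is reused
        if e <= 0.0:
            stable.append(cand)
    model = retrain(model, training_data)           # next round's model is better calibrated
print(len(stable), "candidates confirmed stable in this toy run")
```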

In its first round, GNoME predicted materials’ stability with a precision of around 5%, but that precision climbed quickly through the iterative learning process. By the final round, the structure-based model predicted stability correctly more than 80% of the time, and the composition-based model 33% of the time. 


Using AI models to come up with new materials is not a novel idea. The Materials Project, a program led by Kristin Persson at Berkeley Lab, has used similar techniques to discover and improve the stability of 48,000 materials. 

However, GNoME’s size and precision set it apart from previous efforts. It was trained on at least an order of magnitude more data than any previous model, says Chris Bartel, an assistant professor of chemical engineering and materials science at the University of Minnesota. 

Doing similar calculations has previously been expensive and limited in scale, says Yifei Mo, an associate professor of materials science and engineering at the University of Maryland. GNoME allows these computations to scale up with higher accuracy and at much less computational cost, Mo says: “The impact can be huge.”

Once new materials have been identified, it is equally important to synthesize them and prove their usefulness. Berkeley Lab’s new autonomous laboratory, named the A-Lab, has been using GNoME’s discoveries, integrating robotics with machine learning to optimize the development of such materials.

The lab decides for itself how to make a proposed material, creating up to five initial formulations. These formulations are generated by a machine-learning model trained on the existing scientific literature. After each experiment, the lab uses the results to adjust its recipes.
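A hedged sketch of that closed loop might look like the code below. The helper names (`propose_recipes`, `run_synthesis`) are invented stand-ins for the literature-trained model and the robotic synthesis-and-characterization step, and the recipe parameters and purity scores are made up.

```python
import random

def propose_recipes(target, n=5):
    """Stand-in for the ML model trained on published syntheses."""
    return [{"target": target,
             "temp_C": random.choice([700, 800, 900, 1000]),
             "hours": random.choice([4, 8, 12])}
            for _ in range(n)]

def run_synthesis(recipe):
    """Stand-in for the robotic furnace plus characterization; returns a made-up phase purity in [0, 1)."""
    return random.random()

def make_material(target, success_threshold=0.5, max_attempts=10):
    recipes = propose_recipes(target)               # up to five initial formulations
    best_recipe, best_purity = None, -1.0
    for _ in range(max_attempts):
        if recipes:
            recipe = recipes.pop(0)
        else:
            # Out of initial proposals: tweak the best recipe so far,
            # mirroring how the lab adjusts recipes after each experiment.
            recipe = dict(best_recipe)
            recipe["temp_C"] += random.choice([-50, 50])
        purity = run_synthesis(recipe)
        if purity > best_purity:
            best_recipe, best_purity = recipe, purity
        if purity >= success_threshold:
            break                                   # good enough: count it as a success
    return best_recipe, best_purity

print(make_material("hypothetical_target_compound"))
```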

Researchers at Berkeley Lab say that the A-Lab ran 355 experiments over 17 days and successfully synthesized 41 of 58 proposed compounds, which works out to more than two successful syntheses a day.


In a typical, human-led lab, it takes much longer to make materials. “If you’re unlucky, it can take months or even years,” said Persson at a press briefing. Most students give up after a few weeks, she said. “But the A-Lab doesn’t mind failing. It keeps trying and trying.”

Researchers at DeepMind and Berkeley Lab say these new AI tools can help accelerate hardware innovation in energy, computing, and many other sectors.

“Hardware, especially when it comes to clean energy, needs innovation if we are going to solve the climate crisis,” says Persson. “This is one aspect of accelerating that innovation.”

Bartel, who was not involved in the research, says that these materials will be promising candidates for technologies spanning batteries, computer chips, ceramics, and electronics. 

Lithium-ion battery conductors are one of the most promising use cases. These conductors shuttle lithium ions between a battery’s electrodes as it charges and discharges. DeepMind says GNoME identified 528 promising lithium-ion conductors among its discoveries, some of which may help make batteries more efficient. 

However, even after new materials are discovered, it usually takes decades for industries to take them to the commercial stage. “If we can reduce this to five years, that will be a big improvement,” says Cubuk.
