By Sophia Chen
APS March Meeting 2017 — Materials scientists are like chefs. "You shake and you bake, and your materials come out," says Wenhao Sun of Lawrence Berkeley National Laboratory (LBNL). But historically, they take forever to serve dinner. A researcher developing a semiconductor for a solar cell might spend three years tweaking different chemical compositions before even settling on a recipe.
Over the past decade, to speed up this process, materials scientists have begun to rely on computational methods to predict a material’s properties in advance. Researchers discussed progress in the field at this year’s March Meeting, which included 13 sessions on the theme of "Computational Discovery and Design of New Materials."
"There are so many sessions this year, and so many young people attending these sessions," says Turab Lookman of Los Alamos National Laboratory. "It’s very exciting."
Lookman credits the explosion of activity in part to the Materials Genome Initiative, a government-led consortium that began in 2011 under President Obama. On average, it takes 30 years for a new material to be discovered in the lab and deployed in a commercial product. The initiative’s goal is to cut this time in half. Federal agencies such as the National Science Foundation, the departments of Energy and Defense, and NASA participate by calling for proposals specifically related to materials design.
"Designing" materials is relatively new territory. Conventionally, scientists happen upon the material first and figure out its useful properties afterward. Take silicon, for example, which took over 40 years of tinkering and basic research to evolve from a mystery crystal at the heart of a radio into a transistor. The goal now is to move away from intuition-driven trial and error experiments. Eventually, Lookman wants to be able to tell a computer that he’s looking for a semiconductor with some specific band gap and heat capacity, and have the computer spit out the chemical formula of a material that fits the description.
To achieve this, materials scientists are turning to density functional theory (DFT), developed in the mid-1960s. Instead of framing Schrödinger’s equation in terms of each particle’s wavefunction — those complex probability amplitudes at the root of so many undergraduate headaches — DFT recasts the problem in terms of a single quantity, the electron density, which is much easier to compute.
But it’s still not straightforward to formulate and compute the relevant equations for a particular material. You might divide a solid into repeating units of hundreds of electrons, and then have to account for the interactions among all of those electrons in three dimensions. Researchers usually tame the enormous number of calculations by making approximations — and still often need a supercomputer to complete the task. "And then you hope the approximation is close to the real answer," says graduate student Thomas Baker of the University of California, Irvine.
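The scaling problem Baker describes can be made concrete with a back-of-the-envelope sketch (a hypothetical illustration, not from the article): a many-electron wavefunction must be stored over every combination of electron coordinates, while the density lives on a single spatial grid.

```python
# Toy illustration (not from the article) of why a full many-electron
# wavefunction is intractable while the electron density is not.
# Assumption: a coarse 3D grid of 10 points per axis (1000 points total).

grid_points = 10 ** 3  # points needed to sample one particle's 3D position


def wavefunction_values(n_electrons, grid=grid_points):
    """A wavefunction depends on all electron coordinates at once,
    so the number of stored values grows exponentially with electron count."""
    return grid ** n_electrons


def density_values(grid=grid_points):
    """The electron density is a single function of position,
    so its storage is fixed regardless of electron count."""
    return grid


for n in (1, 2, 5, 100):
    print(n, "electrons:", wavefunction_values(n), "vs", density_values())
```

Even on this crude grid, a 100-electron wavefunction would need 10³⁰⁰ values, while the density never needs more than 1000.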
Baker’s group, led by Kieron Burke at Irvine, demonstrated that they could use machine learning to simplify DFT calculations. They fed exact DFT results on short hydrogen chains to a computer, and through machine learning techniques, could then accurately calculate the properties of longer chains of hydrogen without time-consuming DFT computations. "With machine learning, you could run these calculations on a laptop instead of a supercomputer," Baker says.
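A minimal sketch of that idea, with invented energies and an ordinary linear fit standing in for the group's actual machine learning method: train on cheap short-chain results, then predict a longer chain without a fresh DFT run.

```python
# Hypothetical sketch (not the Burke group's code): learn from results on
# short hydrogen chains, then predict a longer chain. The energies below
# are made up, and chosen to be exactly linear in chain length so the
# simple model can recover them; the real work used exact DFT data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented training data: (chain length, total energy in arbitrary units)
lengths = np.array([[2], [4], [6], [8], [10]])
energies = np.array([-2.1, -4.3, -6.5, -8.7, -10.9])

model = LinearRegression().fit(lengths, energies)

# Predict a chain longer than any in the training set
predicted = model.predict(np.array([[12]]))[0]
print(f"predicted energy for a 12-atom chain: {predicted:.2f}")
```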
Other groups at the meeting reported on how to translate simulations into experiments. For example, Lookman’s group simulated different nickel-titanium alloys — and worked with experimentalists to make some of their simulated compounds.
Lookman’s group searched for an alloy that would not fatigue upon repetitive heating and cooling. Such a material could be used as an actuator in an aircraft engine, he says.
They knew that the right alloy would contain some combination of nickel, titanium, copper, palladium, and iron. However, about 800,000 different combinations of these elements exist. To find their top contenders, the group taught a regular computer to search those combinations for the desired property using machine learning. They did this by showing the computer experimental measurements of 60 other nickel-titanium alloys, just like Google teaches its neural networks to recognize the image of a cat. Experimentalists were able to make 36 different alloys based on the computer’s recommendation.
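A hypothetical sketch of that workflow (all names and numbers invented, not Lookman's code): fit a model to a small set of "measured" alloys, then rank a large pool of unmeasured compositions by the predicted property.

```python
# Invented surrogate-model search, loosely mirroring the approach above:
# ~60 measured alloys train a model that then screens thousands of
# candidate compositions of (Ni, Ti, Cu, Pd, Fe).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)


def fatigue(x):
    """Made-up ground truth (lower is better), used only to
    generate example 'measurements' for this sketch."""
    return ((x - 0.2) ** 2).sum(axis=1)


# Stand-in for the 60 experimentally measured alloys:
# each row is a composition (fractions of the five elements, summing to 1)
measured = rng.dirichlet(np.ones(5), size=60)
model = RandomForestRegressor(random_state=0).fit(measured, fatigue(measured))

# Score a large pool of unmeasured candidate compositions
candidates = rng.dirichlet(np.ones(5), size=5000)
scores = model.predict(candidates)
best = candidates[scores.argmin()]
print("suggested composition (Ni, Ti, Cu, Pd, Fe):", best.round(3))
```

The experimental step then closes the loop: the most promising predicted compositions are synthesized and measured, and the new data can retrain the model.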
But even with these success stories, computer-based approaches have their limits. Sun, of LBNL, is skeptical of how broadly useful machine learning will be, despite the high number of talks at the meeting about it. "I made a joke in my talk," he says. "I said that the only neural networks we used were the neural networks of our nine trained materials scientists."
He points out that machine learning algorithms — like Google’s image-recognition software — generally need to be trained with a lot of data. "And the reality is that materials science is not a big data problem," Sun says. "It’s kind of small data. For big data, you need 10⁹ data points. But metals, salts, ceramics, semiconductors, everything combined — there are only 10⁵ of them. Within a small material class there might be as few as 100 or 1000."
Two of Sun’s colleagues, Gerbrand Ceder of the University of California, Berkeley and Kristin Persson of LBNL, launched the Materials Project, a database of material properties, in 2011. Using supercomputers and DFT, they have calculated thermodynamic properties for over 67,000 different inorganic compounds, many of which have never been experimentally made. Their goal is to serve as a "Google" for materials — an organized database where a researcher can easily look up the band gap, density, or conductivity of a material.
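As a toy illustration of that "Google for materials" idea (invented entries and a hypothetical `search` helper; the real Materials Project serves its DFT-computed data through a web API), a property lookup can be as simple as filtering records by band gap:

```python
# Toy in-memory materials database (a hypothetical illustration,
# not the Materials Project's actual schema or API).
materials = [
    {"formula": "Si",   "band_gap_eV": 1.1, "density_g_cm3": 2.33},
    {"formula": "GaAs", "band_gap_eV": 1.4, "density_g_cm3": 5.32},
    {"formula": "NaCl", "band_gap_eV": 8.5, "density_g_cm3": 2.17},
]


def search(entries, min_gap, max_gap):
    """Return formulas whose band gap lies in [min_gap, max_gap] eV."""
    return [m["formula"] for m in entries
            if min_gap <= m["band_gap_eV"] <= max_gap]


# e.g. candidates for a solar absorber with a gap near 1-1.5 eV
print(search(materials, 1.0, 1.5))  # ['Si', 'GaAs']
```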
But it’s challenging to translate these calculations to the lab. Computers often predict that a compound exists, yet an experimentalist can’t make it, Sun says. In particular, he and his colleagues have had difficulty synthesizing certain so-called metastable materials. Diamonds, for example, are metastable: they persist under standard temperature and pressure, but they cannot form under those conditions. Sun is focusing his research efforts on understanding why experimentalists can’t make predicted compounds.
"That’s the critical bottleneck," he says.
But even so, researchers have already developed new fertilizer chemicals and battery materials using computational methods. In 2015, researchers in Germany synthesized the highest-temperature superconductor to date using DFT predictions. One thing is clear: The recipe for the next generation of materials is hiding somewhere in a computer simulation.
The author, a contributor to Wired and Physics Girl, is based in Tucson, Arizona.
©1995 - 2022, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.
Editor: David Voss
Staff Science Writer: Rachel Gaal
Contributing Correspondent: Alaina G. Levine
Publication Designer and Production: Nancy Bennett-Karasik