By Sophia Chen
2018 APS March Meeting, Los Angeles—At the beginning of his equation-filled PowerPoint presentation, invited speaker Ehsan Khatami pulled up a picture of a puppy.
Gesturing toward the puppy’s round, doleful eyes, the San Jose State University physicist explained how he’d used artificial intelligence—the same approach that lets Google’s image recognition software distinguish between puppies and kittens—to identify phases in a quantum condensed matter model.
Simulation of exotic phases and emergent phenomena is a hot topic in condensed matter physics research, but conventional computing methods are reaching their limits: As the number of particles in a model approaches values for actual materials, simulations require supercomputing facilities. "It becomes enormously expensive," says Khatami.
[Image caption] In an Ising model of spins, a neural network can distinguish between a disordered state at high temperature (top) and an ordered state below the transition temperature (bottom).
Khatami and other condensed matter researchers think that artificial intelligence—which includes techniques known as machine learning and neural networks—can be used alongside conventional computing algorithms for studying collections of electrons in a material. Machine learning algorithms can run on an ordinary desktop computer beefed up with additional graphics processing units (GPUs)—computer chips that are widely used to render video game graphics and are well suited to fast parallel computation.
In a machine learning algorithm, a neural network can be "trained" to recognize patterns: Show the algorithm as many photos of dogs and cats as possible, and it can learn to distinguish one animal from the other. In condensed matter physics, instead of sorting cats and dogs, you’re sorting images of known electron configurations that are above and below some critical temperature, says Khatami.
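The sorting step Khatami describes can be sketched in a few lines of code. The toy example below is an illustration of the idea, not his actual setup: it fabricates "ordered" (mostly aligned) and "disordered" (random) spin snapshots, reduces each to a single magnetization feature, and trains a one-neuron classifier by gradient descent to tell the two phases apart.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8  # 8x8 lattice of spins (+1 / -1)

def snapshot(ordered):
    """Toy spin configuration: ordered phases are mostly aligned
    (all-up or all-down with a little noise); disordered are random."""
    if ordered:
        sign = rng.choice([-1, 1])
        flips = rng.random((L, L)) < 0.05   # 5% noise
        return np.where(flips, -sign, sign)
    return rng.choice([-1, 1], size=(L, L))

def feature(spins):
    # |magnetization|: large in the ordered phase, near zero when disordered
    return abs(spins.mean())

# Labeled training set: label 1 = ordered (below Tc), 0 = disordered
X, y = [], []
for _ in range(200):
    for ordered in (True, False):
        X.append(feature(snapshot(ordered)))
        y.append(1.0 if ordered else 0.0)
X, y = np.array(X), np.array(y)

# One-neuron "network" (logistic regression) trained by gradient descent
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))   # sigmoid output
    w -= 0.5 * np.mean((p - y) * X)          # cross-entropy gradient
    b -= 0.5 * np.mean(p - y)

# Evaluate on fresh snapshots the model has never seen
test = [(feature(snapshot(o)), 1.0 if o else 0.0)
        for _ in range(100) for o in (True, False)]
correct = sum((1 / (1 + np.exp(-(w * x + b))) > 0.5) == (lab == 1.0)
              for x, lab in test)
print(f"held-out accuracy: {correct / len(test):.2f}")
```

Real studies of this kind feed the raw configurations to a deeper network and let it discover a discriminating feature on its own; collapsing each image to its magnetization here just keeps the sketch short and the training reliable.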
Condensed matter physicists have only begun to use these artificial intelligence techniques in the past few years. Khatami was introduced to machine learning at the March Meeting just two years ago. "It was a Friday talk, and I had nothing better to do," Khatami told APS News. "I just dropped in, and I stepped out of the room completely fascinated by the idea of neural networks."
Khatami has since used neural networks to look for phase transitions in a 64-particle version of the Hubbard model, a quantum model of charged particles in a lattice. These particles—simplistic models of electrons—have spin and can move freely in the three-dimensional lattice. While still a simple model, it is complicated enough to predict certain phase transitions and difficult to simulate without supercomputers.
Based on the Hubbard model, Khatami produces a series of images, where each pixel represents a particle, and its color represents the particle spin direction. If all the spins are aligned, "the image is completely black or completely white," he says. But Hubbard model images usually aren’t one color: Thanks to quantum noise, they ultimately resemble a poorly played game of Tetris.
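The pixel encoding Khatami describes, one site per pixel, spin direction as color, can be written down directly. This is an illustrative mapping, not his actual pipeline, and the random spins stand in for output from a real simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 8x8 lattice of spins (+1 = up, -1 = down); a real study would
# take these from a Hubbard-model simulation, not random draws.
spins = rng.choice([-1, 1], size=(8, 8))

# Map spin direction to a grayscale pixel: up -> white (255), down -> black (0)
pixels = ((spins + 1) // 2 * 255).astype(np.uint8)

# A fully aligned lattice gives an all-white (or all-black) image
aligned = np.ones((8, 8), dtype=int)
aligned_pixels = ((aligned + 1) // 2 * 255).astype(np.uint8)
print(pixels.shape, aligned_pixels.min())
```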
Khatami trains the neural network with these images. Then he asks the algorithm to predict what the model will look like under a different set of conditions. His neural network predicts a type of spin ordering that is consistent with results found using a more traditional technique, he says.
Because the use of machine learning in condensed matter physics is so new, researchers are still doing proof-of-principle studies. "We’ve shown that we can reproduce results seen with other techniques with much less effort," says physicist Simon Trebst of the University of Cologne. Trebst, who presented an invited talk after Khatami, identified phase transitions in a system of several hundred fermions using a hybrid method that combines machine learning with a conventional numerical method, the Quantum Monte Carlo algorithm.
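The conventional side of such hybrid methods is Monte Carlo sampling. A minimal classical analogue, a Metropolis sampler for the 2D Ising model rather than the quantum Monte Carlo Trebst's group uses, looks like this (lattice size, temperature, and sweep count are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
L, T = 16, 1.5   # 16x16 lattice, temperature below the 2D Ising Tc (~2.27)

# Start from the fully ordered state so a short run equilibrates quickly
spins = np.ones((L, L), dtype=int)

def sweep(spins, T):
    """One Metropolis sweep: L*L single-spin flip proposals."""
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        # Energy cost of flipping spin (i, j); periodic boundary conditions
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nn
        # Accept if it lowers the energy, else with Boltzmann probability
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

for _ in range(100):
    sweep(spins, T)

m = abs(spins.mean())
print(f"|magnetization| at T={T}: {m:.2f}")   # stays large below Tc
```

Snapshots drawn from a sampler like this are exactly the kind of labeled training images a phase-classifying network consumes, which is how the two techniques plug together.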
Machine learning could be useful in other areas of condensed matter research. Khatami thinks that experimentalists could use machine learning to look for patterns in electron microscopy or scanning tunneling microscopy images.
Trebst believes that combining machine learning with condensed matter experiments could help answer questions about the inner workings of the network itself. Right now, researchers know that the machine can find patterns, but they don’t really understand how it finds them. Research suggests that some machine learning processes work like a mathematical technique used in both particle and condensed matter physics called "renormalization." This method systematically maps a microscopic picture onto a macroscopic one. Intuitions gained in condensed matter physics could help unlock how machines learn.
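Renormalization's core move, mapping a microscopic picture onto a coarser one, can be illustrated with the classic block-spin construction. This is a toy majority-vote version, offered to make "systematically maps a microscopic picture onto a macroscopic one" concrete, not a claim about what any particular neural network computes:

```python
import numpy as np

def block_spin(spins, b=2):
    """Coarse-grain a spin lattice: replace each b x b block by the
    sign of its total spin (majority rule), halving the resolution."""
    L = spins.shape[0]
    blocks = spins.reshape(L // b, b, L // b, b).sum(axis=(1, 3))
    # Break exact ties toward +1 so every output spin is +1 or -1
    return np.where(blocks >= 0, 1, -1)

rng = np.random.default_rng(3)
micro = rng.choice([-1, 1], size=(8, 8))
coarse = block_spin(micro)     # 8x8 -> 4x4
coarser = block_spin(coarse)   # 4x4 -> 2x2
print(micro.shape, coarse.shape, coarser.shape)
```

The analogy researchers have explored is that successive layers of a deep network may discard microscopic detail in a way that resembles these successive coarse-graining steps.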
Machine predictions alone will not directly confirm new physics, says Trebst. The neural network is a mathematical algorithm only as good as the data it has been given, and the process it uses to identify and extrapolate patterns is still mysterious.
Instead, machine learning predictions can help guide condensed matter experiments that probe the interactions of hundreds, thousands, or even 10²³ electrons. These systems are difficult to simulate with conventional computational methods, which yield approximate predictions that are hard to test. Machine learning results can help build consensus among these various predictions and steer the next steps. "These problems are so hard that any guidance is needed," says Trebst.
The author is a freelance science writer in Tucson, Arizona.
©1995 - 2020, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.
Editor: David Voss
Staff Science Writer: Leah Poffenberger
Contributing Correspondent: Alaina G. Levine
Publication Designer and Production: Nancy Bennett-Karasik