Artificial Intelligence

Artificial Intelligence Sheds Light on Phase Transitions


Scientists at Tokyo Metropolitan University have used machine learning to study spin models, which are used in physics to investigate phase transitions. Earlier work had shown that artificial intelligence (AI) of the kind that classifies images and handwriting could be used to differentiate states in the most basic of these models.

Simulated low-temperature (left) and high-temperature (right) phases of a 2D Ising model, where blue points are spins pointing up and red points are spins pointing down. Notice that the spins in the low-temperature phase are mostly in the same direction. This is called a ferromagnetic phase. On the other hand, at high temperature, the ratio of up to down spins is closer to 50:50. This is called a paramagnetic phase. Image Credit: Tokyo Metropolitan University.

The researchers demonstrated that the method is relevant to more complicated models and discovered that an AI trained on one model and applied to another could expose crucial similarities between diverse phases in various systems.

AI and machine learning are transforming the way people drive, work, live, and play. Autonomous cars, the algorithm that outperformed a Go grandmaster, and developments in finance are only the tip of the iceberg of a wide variety of applications that are having a substantial impact on society.

AI is also making a huge impact on scientific research. One of the main attractions of these algorithms is that they can be trained on pre-classified data (for example, pictures of handwritten letters) and then applied to categorize a much broader range of data.

In the domain of condensed matter physics, recent work by Carrasquilla and Melko (Nature Physics (2017) 13, 431-434) showed that neural networks, the same kind of AI used to recognize handwriting, can be applied to distinguish different phases of matter (for example, solid, liquid, and gas) in basic physical models.

The researchers examined the Ising model, the most basic model for the emergence of magnetism in materials. Each atom on a lattice carries a spin (up or down), and the energy of the system depends on the relative alignment of neighboring spins.

Depending on the conditions, these spins can organize themselves into a ferromagnetic phase (as in iron) or take up arbitrary directions in a paramagnetic phase. Typically, analyses of this type of system involve studying some averaged quantity (for example, the sum of all the spins). The fact that a whole microscopic configuration could be used to classify a phase represented a true paradigm shift.
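The Ising setup above lends itself to a compact simulation. Below is a minimal sketch in Python (NumPy) of a standard Metropolis Monte Carlo update for a 2D Ising model with periodic boundaries; the article does not say how the configurations were generated, so the algorithm, lattice size, and temperature here are illustrative assumptions.

```python
import numpy as np

def ising_energy(spins, J=1.0):
    """Total energy of a 2D Ising configuration with periodic boundaries.

    E = -J * sum over nearest-neighbor pairs s_i * s_j; each bond is counted
    once by pairing every site with its right and down neighbor.
    """
    right = np.roll(spins, -1, axis=1)
    down = np.roll(spins, -1, axis=0)
    return -J * np.sum(spins * right + spins * down)

def metropolis_sweep(spins, T, rng, J=1.0):
    """One Metropolis sweep: attempt a single-spin flip at L*L random sites."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nb      # energy cost of flipping s_ij
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
L = 16
spins = rng.choice([-1, 1], size=(L, L))     # random initial configuration
for _ in range(200):
    metropolis_sweep(spins, T=1.5, rng=rng)  # well below T_c ~ 2.269 (2D Ising)
m = abs(spins.mean())                        # magnetization -> 1 in the ferromagnetic phase
```

At low temperature the absolute magnetization `m` drifts toward 1 (ferromagnetic order); rerunning with a high temperature such as `T=4.0` leaves it near 0 (paramagnetic), matching the two phases shown in the figure.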

Currently, a research team headed by Professors Hiroyuki Mori and Yutaka Okabe from Tokyo Metropolitan University is working together with the Bioinformatics Institute in Singapore to expand this approach. In its present form, the technique developed by Carrasquilla and Melko cannot be used on more complicated models than the Ising model. For instance, in the q-state Potts model, atoms can take one of the q states rather than just “up” or “down.”

While this model also has a phase transition, it is not trivial to differentiate the phases. In fact, the 5-state model has 120 (that is, 5!) relabelings of the states that are physically equivalent. To help an AI tell the phases apart, the researchers gave it more microscopic information, specifically how the state of a given atom correlates with the state of another atom some distance away, i.e. how the spins are correlated over separation.
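The extra microscopic input described above can be sketched as a correlation profile: for each separation r, the fraction of site pairs in the same Potts state. The function names and the restriction to horizontal separations below are illustrative assumptions, not the authors' exact feature set.

```python
import numpy as np

def potts_correlation(state, r):
    """Fraction of site pairs at horizontal separation r that are in the same
    Potts state (periodic boundaries) -- a simple same-state correlation."""
    shifted = np.roll(state, -r, axis=1)
    return np.mean(state == shifted)

def correlation_profile(state, max_r):
    """Correlations at separations 1..max_r: one input vector per sample."""
    return np.array([potts_correlation(state, r) for r in range(1, max_r + 1)])

rng = np.random.default_rng(1)
q, L = 5, 32
ordered = np.zeros((L, L), dtype=int)         # every site in state 0
disordered = rng.integers(0, q, size=(L, L))  # uniformly random states

correlation_profile(ordered, 4)      # all ones: perfect correlation
correlation_profile(disordered, 4)   # near 1/q = 0.2 at every separation
```

An ordered configuration stays correlated at all distances, while a disordered one decays to the random baseline 1/q, so profiles like these carry exactly the phase information the classifier needs.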

When the researchers trained the AI with many of these correlation configurations for both the 3- and 5-state Potts models, they found that it could accurately classify the phases and pinpoint the temperature at which the transition occurred. They were also able to correctly account for the dependence of their results on the number of points in the lattice, the finite-size effect.
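As a toy version of this training step, the sketch below fits a logistic regression, a deliberately simple stand-in for the neural network used in the study, on correlation profiles of synthetic "ordered" and "disordered" 3-state configurations. The noise level, sample counts, and optimizer settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def correlation_profile(state, max_r=4):
    """Same-state fraction at horizontal separations 1..max_r (periodic)."""
    return np.array([np.mean(state == np.roll(state, -r, axis=1))
                     for r in range(1, max_r + 1)])

# Hypothetical training set: ordered samples (one dominant state plus 10%
# random noise) vs disordered samples (uniformly random states), standing in
# for Monte Carlo configurations below and above the transition temperature.
q, L, n = 3, 16, 200
X, y = [], []
for _ in range(n):
    noisy = np.where(rng.random((L, L)) < 0.1, rng.integers(0, q, (L, L)), 0)
    X.append(correlation_profile(noisy)); y.append(1)        # "ordered"
    X.append(correlation_profile(rng.integers(0, q, (L, L)))); y.append(0)
X, y = np.array(X), np.array(y)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features

# Logistic regression trained by gradient descent on the profiles.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

accuracy = np.mean((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y)
```

Because the ordered and disordered profiles are widely separated, even this linear classifier reaches high training accuracy; the study's neural network plays the analogous role on real simulation data.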

After demonstrating the effectiveness of their technique, the scientists applied the same method to a q-state clock model, in which spins take one of q orientations on a circle. When q is greater than or equal to 5, the system can be in one of three phases: a disordered high-temperature phase, an ordered low-temperature phase, and a phase in between known as the Berezinskii-Kosterlitz-Thouless (BKT) phase. Work on the BKT transition contributed to the 2016 Nobel Prize in Physics awarded to John M. Kosterlitz, David J. Thouless, and Duncan Haldane.
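For reference, the energy of a q-state clock configuration can be written down directly: spins point along one of q equally spaced directions, and each neighboring pair contributes -J cos(θ_i - θ_j). The minimal sketch below assumes periodic boundaries and is not the authors' code.

```python
import numpy as np

def clock_energy(angle_idx, q, J=1.0):
    """Energy of a 2D q-state clock configuration with periodic boundaries.

    Each site holds an integer 0..q-1 mapped to an angle 2*pi*k/q; the energy
    is E = -J * sum over nearest-neighbor pairs cos(theta_i - theta_j),
    counting each bond once via the right and down neighbors.
    """
    theta = 2.0 * np.pi * angle_idx / q
    right = np.roll(theta, -1, axis=1)
    down = np.roll(theta, -1, axis=0)
    return -J * np.sum(np.cos(theta - right) + np.cos(theta - down))
```

For q = 2 the angles are 0 and pi and this reduces to the Ising model; as q grows, the allowed directions approach a continuum, which is what makes room for the intermediate BKT phase.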

The researchers successfully trained an AI to distinguish the three phases using configurations from a 6-state clock model. When they then applied it to configurations from a 4-state clock model, where only two phases are expected to exist, they found that the algorithm classified the system near the phase transition as being in a BKT phase.

This points to a deep connection between the BKT phase and the critical state that emerges at the smooth "second-order" phase transition point in the 4-state system.

The technique demonstrated by the researchers is broadly applicable to a host of scientific problems. A vital part of physics is universality: recognizing traits in apparently unconnected phenomena or systems that give rise to unified behavior.

Machine learning is distinctively positioned to tease these features out of the most complicated systems and models, allowing researchers to have a glimpse of the deep connections that control nature and the universe.

Source: https://www.tmu.ac.jp/english/  
