Abstract

We often encounter hysteresis, associated with short-term memory in the brain, when we view continuously varying pictures swept back and forth. We may likewise expect a neural network (NN) to exhibit hysteresis when it recognizes information presented back and forth, depending on the continuity of that information. This study shows that when equilibrium configurations obtained from well-defined models undergoing phase transitions are used as training data, the weights of an NN can exhibit hysteresis under sequential learning as an external field or temperature is increased (or decreased). Indeed, hysteresis in the weights clearly appears when the NN learns, back and forth, a series of configurations passing through a continuous transition or a crossover, whereas no hysteresis arises for a discontinuous transition. This finding opens up a new application of machine learning: judging the order of a phase transition from suitably visualized changes in microscopic configurations, without specifying a particular model.
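To make the protocol concrete, the following is a minimal sketch of the idea of sweeping a control parameter up and then back down while training sequentially, and comparing the weight trajectories of the two sweep directions. Everything here is hypothetical: `synthetic_configs` is a crude stand-in for real equilibrium (e.g., Monte Carlo) samples, the single logistic unit is not the authors' architecture, and the weight-norm gap is just one possible hysteresis diagnostic.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_configs(T, n=64, L=16):
    """Placeholder 'equilibrium' configurations: spins biased by a
    mean-field-like magnetization m(T). Hypothetical stand-in for
    real samples of a model with a transition near Tc ~ 2.27."""
    m = np.tanh(max(0.0, 2.27 - T))       # crude order-parameter proxy
    p_up = 0.5 * (1.0 + m)
    return rng.choice([-1.0, 1.0], size=(n, L * L), p=[1 - p_up, p_up])

def train_step(w, X, y, lr=0.05):
    """One gradient step of a single logistic unit classifying
    ordered vs disordered phases (labels from T < Tc here)."""
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    grad = X.T @ (pred - y) / len(y)
    return w - lr * grad

L, Tc = 16, 2.27
w = rng.normal(0.0, 0.01, L * L)
temps_up = np.linspace(1.0, 3.5, 60)
sweep = np.concatenate([temps_up, temps_up[::-1]])  # up, then back down

norms = []
for T in sweep:
    X = synthetic_configs(T)
    y = np.full(len(X), float(T < Tc))
    for _ in range(5):                    # a few sequential updates per T
        w = train_step(w, X, y)
    norms.append(np.linalg.norm(w))

half = len(temps_up)
# Hysteresis signature: the weight trajectory on the way up differs
# from the trajectory on the way back down at the same temperatures.
gap = np.abs(np.array(norms[:half]) - np.array(norms[half:][::-1]))
print("mean |up - down| weight-norm gap:", gap.mean())
```

Under the abstract's claim, such a gap should remain visible for configurations crossing a continuous transition or crossover, and vanish for a discontinuous one; verifying that would require genuine equilibrium samples rather than this synthetic proxy.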
