Abstract

We analyze the problem of supervised learning of ferromagnetic phase transitions from the statistical physics perspective. We consider two systems in two universality classes, the two-dimensional Ising model and the two-dimensional Baxter-Wu model, and perform a careful finite-size analysis of the results of supervised learning of the phases of each model. We find that the variance of the neural network (NN) output function (VOF) as a function of temperature has a peak in the critical region. Qualitatively, the VOF is related to the classification rate of the NN. We find that the width of the VOF peak displays finite-size scaling governed by the correlation length exponent ν of the universality class of the model. We check this conclusion using several NN architectures: a fully connected NN, a convolutional NN, and several members of the ResNet family. We also discuss the accuracy of the extracted values of the critical exponent ν.
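The finite-size-scaling procedure summarized above can be sketched in a few lines: compute the variance of the NN output across samples at each temperature, measure the width of the resulting peak for each lattice size L, and extract ν from the scaling w(L) ∝ L^(−1/ν). This is a minimal illustrative sketch under stated assumptions (a Gaussian-shaped VOF peak and a full-width-at-half-maximum definition of the peak width); the function names and the synthetic data are ours, not the paper's.

```python
import numpy as np

def vof_curve(nn_outputs_by_T):
    """Variance of the NN output over samples at each temperature T."""
    return {T: np.var(outputs) for T, outputs in nn_outputs_by_T.items()}

def peak_width(temps, vof, frac=0.5):
    """Width of the VOF peak at a fraction `frac` of its maximum
    (frac=0.5 gives the full width at half maximum)."""
    temps, vof = np.asarray(temps), np.asarray(vof)
    above = temps[vof >= frac * vof.max()]
    return above.max() - above.min()

def estimate_nu(sizes, widths):
    """Fit w(L) ~ L^(-1/nu): the slope of log w vs log L is -1/nu."""
    slope, _ = np.polyfit(np.log(sizes), np.log(widths), 1)
    return -1.0 / slope

if __name__ == "__main__":
    # Synthetic check: Gaussian VOF peaks whose widths shrink as L^(-1/nu)
    # with nu = 1 (the exact value for the 2D Ising universality class).
    nu_true, Tc = 1.0, 2.269
    sizes = [8, 16, 32, 64]
    temps = np.linspace(Tc - 1, Tc + 1, 20001)
    widths = []
    for L in sizes:
        w = L ** (-1.0 / nu_true)
        sigma = w / (2 * np.sqrt(2 * np.log(2)))  # FWHM -> Gaussian sigma
        vof = np.exp(-((temps - Tc) ** 2) / (2 * sigma ** 2))
        widths.append(peak_width(temps, vof))
    print(f"estimated nu = {estimate_nu(sizes, widths):.3f}")
```

In practice `nn_outputs_by_T` would hold the NN softmax outputs evaluated on Monte Carlo configurations at each temperature, and the peak-width definition (half maximum, second moment, or a fit) is a methodological choice that affects the precision of the extracted ν.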
