Abstract

Accumulating evidence shows that the cerebral cortex operates near a critical state characterized by a power-law size distribution of neural avalanches, yet evidence of this critical state in artificial neural networks mimicking the cerebral cortex is still lacking. Here we design an artificial neural network of coupled phase oscillators and, using the reservoir-computing technique of machine learning, train it to predict chaotic time series. We find that when the machine is properly trained, the oscillators in the reservoir synchronize into clusters whose sizes follow a power-law distribution; this feature is absent when the machine is poorly trained. Moreover, regardless of the synchronization degree of the original network, a properly trained reservoir always develops into the same critical state, exemplifying the "attractor" nature of this state in machine learning. The generality of the results is verified with different reservoir models and different target systems: the scaling exponent of the distribution is independent of the reservoir details and the bifurcation parameters of the target system, but changes when the dynamics of the target system is switched to a different type. The findings shed light on the nature of machine learning and are helpful for the design of high-performance machines in physical systems.
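
To make the described setup concrete, below is a minimal sketch of the kind of pipeline the abstract outlines: a reservoir of Kuramoto-type coupled phase oscillators driven by a chaotic signal, a linear readout trained by ridge regression to predict the signal, and a rough count of synchronization-cluster sizes. The Lorenz target, the coupling form, the sin(theta) readout features, the ridge parameter, and the coherence-threshold clustering are all illustrative assumptions, not the authors' exact model.

```python
# Illustrative sketch only: a Kuramoto-type phase-oscillator reservoir driven by
# a chaotic signal, a ridge-regression readout, and a rough cluster-size count.
# Every modeling choice and parameter below is an assumption, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def lorenz_x(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """x-component of the Lorenz system (Euler integration), used as the target."""
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty(n_steps)
    for t in range(n_steps):
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out[t] = x
    return out

# Reservoir: N phase oscillators with sparse random coupling and random input weights.
N = 300
omega = rng.normal(0.0, 1.0, N)                   # natural frequencies (assumed Gaussian)
A = (rng.random((N, N)) < 0.1).astype(float)      # sparse coupling topology (assumed)
W_in = rng.uniform(-1.0, 1.0, N)                  # input weights (assumed uniform)
K, eps, dt = 1.0, 0.5, 0.05                       # coupling, input strength, time step

def step(theta, u):
    """One Euler step of the input-driven Kuramoto dynamics."""
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + (K / N) * coupling + eps * W_in * u)

# Drive the reservoir with the (normalized) chaotic signal and record the phases.
signal = lorenz_x(6000)
signal = (signal - signal.mean()) / signal.std()
theta = rng.uniform(0.0, 2.0 * np.pi, N)
theta_hist = np.empty((len(signal), N))
for t, u in enumerate(signal):
    theta = step(theta, u)
    theta_hist[t] = theta
states = np.sin(theta_hist)                       # readout features (assumed sin(theta))

# Linear readout trained by ridge regression for one-step-ahead prediction.
washout, train_end, lam = 500, 4000, 1e-4
X, Y = states[washout:train_end - 1], signal[washout + 1:train_end]
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y)
pred = states[train_end:-1] @ W_out
rmse = np.sqrt(np.mean((pred - signal[train_end + 1:]) ** 2))
print(f"one-step prediction RMSE: {rmse:.3f}")

# Synchronization clusters: pairwise phase coherence over the last 500 steps,
# thresholded and grouped into connected components (a crude illustrative criterion).
z = np.exp(1j * theta_hist[-500:])
coherence = np.abs(z.conj().T @ z) / z.shape[0]   # |<exp(i(theta_i - theta_j))>_t|
adj = coherence > 0.95
visited = np.zeros(N, dtype=bool)
sizes = []
for i in range(N):
    if visited[i]:
        continue
    stack, size = [i], 0
    visited[i] = True
    while stack:
        j = stack.pop()
        size += 1
        nbrs = np.where(adj[j] & ~visited)[0]
        visited[nbrs] = True
        stack.extend(nbrs)
    sizes.append(size)
print("cluster sizes (largest first):", sorted(sizes, reverse=True)[:10])
```

In the setting the abstract describes, it is the size distribution of such synchronization clusters that follows a power law when the readout is well trained; the clustering criterion above is only one simple way to extract those sizes.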
