Abstract

We introduce a methodology for seeking conservation laws within a Hamiltonian dynamical system, which we term "neural deflation." Inspired by deflation methods for steady states of dynamical systems, we propose to iteratively train a number of neural networks to minimize a regularized loss function that accounts for the requirement that conserved quantities be in involution and enforces their functional independence, consistently in the infinite-sample limit. The method is applied to a series of integrable and nonintegrable lattice differential-difference equations. In the former, the predicted number of conservation laws grows extensively with the number of degrees of freedom, while for the latter, it generically stops at a threshold related to the number of conserved quantities in the system. This data-driven tool could prove valuable in assessing a model's conserved quantities and its potential integrability.
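To make the ingredients of the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of the two structural requirements named above: a candidate conserved quantity Q must Poisson-commute with the Hamiltonian H and with all previously found quantities (involution), while a deflation-style penalty discourages Q's gradient from lying in the span of the previous quantities' gradients (functional independence). All function and variable names are illustrative.

```python
import jax
import jax.numpy as jnp

# Canonical Poisson bracket {f, g}(z) for z = (q_1..q_n, p_1..p_n).
def poisson_bracket(f, g, z):
    n = z.shape[0] // 2
    gf, gg = jax.grad(f)(z), jax.grad(g)(z)
    return gf[:n] @ gg[n:] - gf[n:] @ gg[:n]

# Hypothetical loss for one deflation round: the involution terms
# vanish when Q commutes with H and with each previously found
# quantity; the denominator (a crude independence measure) shrinks
# when grad Q is functionally dependent on the previous gradients,
# inflating the loss and "deflating" already-found quantities.
def deflation_loss(Q, H, found, samples, eps=1e-3):
    def per_sample(z):
        invol = poisson_bracket(H, Q, z) ** 2
        invol += sum(poisson_bracket(F, Q, z) ** 2 for F in found)
        # Residual of grad Q after projecting out previous gradients.
        resid = jax.grad(Q)(z)
        for F in found:
            gf = jax.grad(F)(z)
            resid = resid - (resid @ gf) / (gf @ gf + eps) * gf
        return invol / (resid @ resid + eps)
    return jnp.mean(jax.vmap(per_sample)(samples))

# Sanity check on a 1-DOF harmonic oscillator, H = (q^2 + p^2) / 2:
# Q = H commutes with itself, so the involution loss vanishes.
H = lambda z: 0.5 * (z[0] ** 2 + z[1] ** 2)
samples = jax.random.normal(jax.random.PRNGKey(0), (64, 2))
print(float(deflation_loss(H, H, [], samples)))  # ~0.0
```

In an actual neural-deflation run, Q would be a small neural network trained by gradient descent on this loss over sampled phase-space points, with one network per round; the sketch above only evaluates the loss for fixed functions.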
