Abstract

A statistical relaxation phenomenon is studied for a general class of dispersive wave equations of nonlinear Schrödinger type which govern non-integrable, non-singular dynamics. In a bounded domain, the solutions of these equations have been shown numerically to tend in the long-time limit toward a Gibbsian statistical equilibrium state consisting of a ground-state solitary wave on the large scales and Gaussian fluctuations on the small scales. The main result of the paper is a large deviation principle that expresses this concentration phenomenon precisely in the relevant continuum limit. The large deviation principle pertains to a process governed by a Gibbs ensemble that is canonical in energy and microcanonical in particle number. Some supporting Monte Carlo simulations of these ensembles are also included to show the dependence of the concentration phenomenon on the properties of the dispersive wave equation, especially the high-frequency growth of the dispersion relation. The large deviation principle for the process governed by the Gibbs ensemble is based on a large deviation principle for Gaussian processes, for which two independent proofs are given.
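To illustrate the mixed ensemble mentioned above, the following is a minimal sketch (not the paper's actual scheme) of a Metropolis sampler for a discretized focusing NLS field on a periodic lattice: the energy enters canonically through a Boltzmann factor, while the particle number is held fixed (microcanonically) by projecting each proposal back onto the constraint sphere. All parameter values and the specific discretization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper):
n, h = 64, 1.0 / 64          # lattice points and spacing on a periodic domain
beta = 50.0                  # inverse temperature (canonical in energy)
N0 = 1.0                     # particle number, held fixed (microcanonical)

def energy(u):
    """Discrete NLS Hamiltonian: kinetic term minus a focusing quartic term."""
    grad = (np.roll(u, -1) - u) / h
    return 0.5 * h * np.sum(np.abs(grad) ** 2) - 0.5 * h * np.sum(np.abs(u) ** 4)

def particle_number(u):
    """Discrete L2 norm squared, the conserved particle number."""
    return h * np.sum(np.abs(u) ** 2)

# Start from a random complex field projected onto the constraint sphere.
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
u *= np.sqrt(N0 / particle_number(u))
E = energy(u)

accepted = 0
for _ in range(20000):
    # Propose a small perturbation, then renormalize to keep N exactly fixed.
    v = u + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    v *= np.sqrt(N0 / particle_number(v))
    dE = energy(v) - E
    # Metropolis accept/reject with the canonical Boltzmann factor in energy.
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        u, E = v, E + dE
        accepted += 1
```

At equilibrium one would inspect the large-scale coherent structure of `u` against the ground-state solitary wave; the projection move here is a simple way to respect the particle-number constraint, though a production sampler would need care about detailed balance on the constraint sphere.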
