Abstract

This paper is concerned with the long-time dynamics of the nonlinear wave equation in one space dimension, $$ u_{tt} - \delta^2 u_{xx} +V'(u) =0 \qquad x\in [0,1] $$ where $\delta>0$ is a parameter and $V(u)$ is a potential bounded from below and growing at least like $u^2$ as $|u|\to\infty$. Infinite-energy solutions of this equation preserve a natural Gibbsian invariant measure, and when the potential is double-welled, for example when $V(u) = \tfrac14(1-u^2)^2$, there is a regime in which two small disjoint sets in the system's phase space concentrate most of the mass of this measure. This suggests that the solutions to the nonlinear wave equation can be metastable over these sets, in the sense that they spend long periods of time in these sets and only rarely transition between them. Here we quantify this phenomenon by calculating exactly, via Transition State Theory (TST), the mean frequency at which solutions of the nonlinear wave equation with initial conditions drawn from its invariant measure cross a dividing surface lying between the metastable sets. Numerical results suggest that the dynamics of the nonlinear wave equation is ergodic and rapidly mixing with respect to the Gibbs invariant measure when the parameter $\delta$ is small enough. In this regime the dynamics of the nonlinear wave equation displays metastable behavior that is not fundamentally different from that observed in its stochastic counterpart, in which random noise and damping terms are added to the equation. For larger $\delta$, however, the dynamics either stops being ergodic, or its mixing time becomes larger than the inverse of the TST frequency, indicating that successive transitions between the metastable sets are correlated and the coarse-graining to a Markov chain fails.
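To make the setting concrete, the equation above can be integrated numerically with a standard finite-difference scheme. The sketch below is purely illustrative and is not the paper's method: it discretizes $u_{tt} = \delta^2 u_{xx} - V'(u)$ with the double-well potential $V(u) = \tfrac14(1-u^2)^2$ (so $V'(u) = u^3 - u$) using velocity-Verlet time stepping; the periodic boundary conditions, grid size, and parameter values are all assumptions made here for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's scheme): velocity-Verlet integration of
#   u_tt = delta^2 u_xx - V'(u),  V(u) = (1 - u^2)^2 / 4,  V'(u) = u^3 - u,
# on [0,1] with periodic boundary conditions (an assumption; the paper does
# not specify its discretization here).

def Vprime(u):
    return u**3 - u

def step(u, v, delta, dx, dt):
    """One velocity-Verlet step for the pair (u, u_t) = (u, v)."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    a = delta**2 * lap - Vprime(u)
    v_half = v + 0.5 * dt * a
    u_new = u + dt * v_half
    lap_new = (np.roll(u_new, -1) - 2.0 * u_new + np.roll(u_new, 1)) / dx**2
    a_new = delta**2 * lap_new - Vprime(u_new)
    v_new = v_half + 0.5 * dt * a_new
    return u_new, v_new

def energy(u, v, delta, dx):
    """Discrete conserved energy: kinetic + gradient + potential terms."""
    grad = (np.roll(u, -1) - u) / dx
    V = 0.25 * (1.0 - u**2)**2
    return dx * np.sum(0.5 * v**2 + 0.5 * delta**2 * grad**2 + V)

if __name__ == "__main__":
    N, delta = 128, 0.05            # assumed grid size and parameter
    dx = 1.0 / N
    dt = 0.5 * dx / delta           # CFL-limited time step for wave speed delta
    x = np.arange(N) * dx
    u = -np.ones(N) + 0.01 * np.sin(2 * np.pi * x)  # start near the u = -1 well
    v = np.zeros(N)
    E0 = energy(u, v, delta, dx)
    for _ in range(1000):
        u, v = step(u, v, delta, dx, dt)
    print(abs(energy(u, v, delta, dx) - E0) / E0)   # relative energy drift
```

In a metastability study one would run such a trajectory (with initial data sampled from the Gibbs measure rather than the near-well state used above) and record crossings of a dividing surface, e.g. where the spatial mean of $u$ changes sign, to compare the observed crossing frequency against the TST prediction.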
