A powerful sufficient condition for the dynamical stability of spherical stellar systems against collisionless spherical perturbations is derived within the framework of general relativity. The derivation proceeds along lines similar to those of an earlier paper devoted to the corresponding Newtonian problem. It begins with the formulation of, and solution to, the problem of determining which relativistic spherical systems locally maximize, at fixed mass-energy, rest mass, and boundary value of the phase-space density, the phase-space integral of a chosen function of the phase-space density. It is shown that members of appropriate one-parameter sequences of spherical equilibria cease to maximize their associated integral functionals at the first maximum of the binding energy along the sequence. A proof is given that a relativistic spherical stellar system in collisionless equilibrium is dynamically stable to collisionless perturbations if it maximizes the unique functional that it extremizes. This implies that members of appropriate equilibrium sequences are stable at least up to the first maximum of the binding energy. The sufficient condition is remarkably strong: in several cases studied, it breaks down at a point that coincides, to within available numerical accuracy, with the calculated point of onset of dynamical instability. This leads to the conjecture that the condition is in fact also generally necessary. If so, one implication would be that general relativity induces spherical dynamical instabilities in certain nearly Newtonian stellar systems. While conceptually important, this would probably be of limited astrophysical interest, since the instability growth times should be quite long.
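The constrained extremization described above can be sketched schematically. The notation here is an illustrative assumption rather than the paper's own: $f$ denotes the phase-space density, $C$ the chosen function of $f$, $M$ the total mass-energy, and $M_0$ the total rest mass.

```latex
% Schematic form of the variational problem (assumed notation):
% maximize the integral functional
%   S[f] = \int C(f)\, d\mathcal{V}
% over phase-space volume \mathcal{V}, subject to fixed M, M_0,
% and a fixed boundary value of f. Introducing Lagrange
% multipliers \lambda and \mu, an equilibrium extremizes
\[
  \delta \bigl( S[f] - \lambda\, M[f] - \mu\, M_0[f] \bigr) = 0 ,
\]
% and the stability criterion requires that this extremum be a
% local maximum of S[f] at fixed M, M_0, and boundary data.
```

On this reading, the "first maximum of the binding energy" along an equilibrium sequence marks the point at which the second variation of the constrained functional first fails to be negative definite, i.e., where the extremum ceases to be a maximum.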