Abstract

We have developed a new algorithm for extracting the line-of-sight velocity distributions (LOSVDs) of galaxies from the broadening of absorption lines in their spectra relative to those of a stellar 'template'. This method models the LOSVD as the sum of a set of Gaussian distributions uniformly spaced in velocity. By choosing the dispersion, $\Delta_\nu$, of these Gaussians so that the separate components are unresolved according to the Rayleigh criterion, such a sum can model any LOSVD that is smooth on scales smaller than $\Delta_\nu$. The velocity scale on which the LOSVD is forced to be smooth is thus set explicitly by the user. The algorithm then involves solving for the amplitudes of the individual components to produce the LOSVD that, when convolved with the stellar template, best reproduces the observed galaxy spectrum in a least-squares sense. This procedure is an example of quadratic programming for which efficient algorithms exist, and, since all of the convolutions in the analysis involve similar Gaussians, the computational expense in calculating the model is also small. The physical constraint that the LOSVD must be non-negative can be readily included in the quadratic programming, and the method allows an explicit treatment of the errors in the derived LOSVD. Monte Carlo tests show that this algorithm can, indeed, efficiently extract a wide range of LOSVDs from absorption-line spectra with attainable signal-to-noise ratios. Application to the early-type disc galaxy UGC 12591 illustrates the potential of this approach for constraining dynamical models of galaxies.
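The following Python sketch illustrates the kind of fit described above: the galaxy spectrum is modelled as the template convolved with a sum of unit-area Gaussians on a uniform velocity grid, and the non-negative amplitudes are found by constrained least squares. The function name `fit_losvd`, the assumption that both spectra are already rebinned onto a common uniform velocity grid, and the use of scipy's non-negative least-squares routine in place of a general quadratic-programming solver are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a Gaussian-sum LOSVD fit (assumptions: spectra rebinned
# onto a uniform velocity grid; NNLS used as the quadratic-programming step).
import numpy as np
from scipy.optimize import nnls


def fit_losvd(template, galaxy, dv, v_centres, delta_v):
    """Fit non-negative amplitudes of uniformly spaced Gaussian LOSVD components.

    template, galaxy : 1-D spectra of equal length, sampled every `dv` km/s.
    v_centres        : uniformly spaced centres of the Gaussian components (km/s).
    delta_v          : common dispersion of the components (km/s).
    """
    n_pix = len(galaxy)
    v_pix = (np.arange(n_pix) - n_pix // 2) * dv  # velocity offset of each pixel

    # Each column of the design matrix is the template convolved with one
    # unit-area Gaussian component.
    columns = []
    for v0 in v_centres:
        kernel = np.exp(-0.5 * ((v_pix - v0) / delta_v) ** 2)
        kernel /= kernel.sum()
        columns.append(np.convolve(template, kernel, mode="same"))
    A = np.column_stack(columns)

    # Non-negative least squares: the special case of quadratic programming
    # with only positivity constraints on the component amplitudes.
    amplitudes, residual_norm = nnls(A, galaxy)
    return amplitudes, residual_norm
```

Because every component shares the same dispersion `delta_v`, the kernels differ only by a velocity shift, which is what keeps the cost of building the design matrix small in practice.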
