This study presents a Bayesian maximum a posteriori (MAP) framework for dynamical system identification from time-series data. This framework is shown to be equivalent to generalized Tikhonov regularization, providing a rational justification for the choices of the residual and regularization terms, respectively, from the negative logarithms of the likelihood and prior distributions. Beyond the estimation of model coefficients, the Bayesian interpretation gives access to the full apparatus of Bayesian inference, including the ranking of models, the quantification of model uncertainties, and the estimation of unknown (nuisance) hyperparameters. Two Bayesian algorithms, joint MAP and variational Bayesian approximation, are compared to the least absolute shrinkage and selection operator (LASSO), ridge regression, and the sparse identification of nonlinear dynamics (SINDy) algorithms for sparse regression by application to several dynamical systems with added Gaussian or Laplace noise. For multivariate Gaussian likelihood and prior distributions, the Bayesian formulation gives Gaussian posterior and evidence distributions, in which the numerator terms can be expressed in terms of the Mahalanobis distance or "Gaussian norm" $\|y-\hat{y}\|^2_{M^{-1}} = (y-\hat{y})^\top M^{-1} (y-\hat{y})$, where $y$ is a vector variable, $\hat{y}$ is its estimator, and $M$ is the covariance matrix. The posterior Gaussian norm is shown to provide a robust metric for quantitative model selection for the different systems and noise models examined.
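The MAP–Tikhonov equivalence stated above can be illustrated with a minimal sketch (not the paper's implementation; the data, noise levels, and prior scale below are illustrative assumptions): for a Gaussian likelihood $y \sim \mathcal{N}(Xw, \sigma^2 I)$ and Gaussian prior $w \sim \mathcal{N}(0, \tau^2 I)$, the negative log-posterior is the ridge (Tikhonov) objective $\|y - Xw\|^2/\sigma^2 + \|w\|^2/\tau^2$, whose minimizer has a closed form.

```python
import numpy as np

# Sketch of the Gaussian-likelihood / Gaussian-prior case: the MAP
# estimate coincides with the generalized Tikhonov (ridge) solution.
# All numbers here are illustrative, not taken from the study.
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
w_true = np.array([1.5, -2.0, 0.5])
sigma, tau = 0.1, 1.0                       # noise and prior std (assumed)
y = X @ w_true + sigma * rng.normal(size=n)

# Minimizing ||y - Xw||^2/sigma^2 + ||w||^2/tau^2 gives
#   w_map = (X^T X + lam I)^{-1} X^T y,  with lam = sigma^2 / tau^2,
# i.e. ridge regression with a Bayesian-motivated penalty weight.
lam = sigma**2 / tau**2
w_map = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# The residual term ||y - X w_map||^2 / sigma^2 is a special case of the
# Gaussian norm (y - yhat)^T M^{-1} (y - yhat) with M = sigma^2 I.
gauss_norm = np.sum((y - X @ w_map) ** 2) / sigma**2
print(w_map, gauss_norm)
```

With weak shrinkage (here `lam = 0.01`) and low noise, the MAP coefficients recover `w_true` closely; heavier priors (smaller `tau`) pull the estimate toward zero, which is the Bayesian reading of the regularization term.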