In this note we consider the stability of posterior measures occurring in Bayesian inference with respect to perturbations of the prior measure and the log-likelihood function. This extends the well-posedness analysis of Bayesian inverse problems. In particular, we prove a general local Lipschitz-continuous dependence of the posterior on the prior and the log-likelihood with respect to several common distances between probability measures, including the total variation, Hellinger, and Wasserstein distances as well as the Kullback–Leibler divergence. We assume only boundedness of the likelihoods and measure their perturbations in an $L^p$-norm with respect to the prior. Under mild assumptions, the obtained stability yields the well-posedness of Bayesian inverse problems, in particular well-posedness with respect to the Wasserstein distance. Moreover, our results indicate an increasing sensitivity of Bayesian inference as the posterior becomes more concentrated, for example, due to more data or more accurate data. This confirms and extends previous observations made in the sensitivity analysis of Bayesian inference.
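For orientation, the type of estimate in question can be sketched as follows; the choice of norms, the constant, and its dependence on the setting are schematic placeholders here, not the precise statements proven in the paper. Given priors $\mu_i$ and log-likelihoods $\Phi_i$, $i = 1, 2$, the posteriors are
\[
  \mu_i^y(\mathrm{d}\theta) \;=\; \frac{\exp\bigl(-\Phi_i(\theta)\bigr)}{Z_i}\,\mu_i(\mathrm{d}\theta),
  \qquad
  Z_i \;=\; \int \exp\bigl(-\Phi_i(\theta)\bigr)\,\mu_i(\mathrm{d}\theta),
\]
and a local Lipschitz stability estimate takes the schematic form
\[
  d\bigl(\mu_1^y, \mu_2^y\bigr)
  \;\le\;
  C \Bigl( d(\mu_1, \mu_2) \;+\; \bigl\|\Phi_1 - \Phi_2\bigr\|_{L^p(\mu_1)} \Bigr),
\]
where $d$ denotes one of the distances above. Schematically, the constant $C$ grows as the normalization constants $Z_i$ shrink, which is one way to read the increasing sensitivity of concentrated posteriors mentioned above.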