We consider the problem of estimating a pdf $f$ from samples $X_1, X_2, \ldots, X_n$ of a random variable with pdf $\mathscr{K}f$, where $\mathscr{K}$ is a compact integral operator. We employ a maximum smoothed likelihood formalism inspired by a nonlinearly smoothed version of the EMS algorithm of Silverman, Jones, Wilson and Nychka. We show that this nonlinearly smoothed algorithm is itself an EM algorithm, which helps explain its strong convergence properties. For the case of (standard) density estimation, that is, when $\mathscr{K}$ is the identity, the method yields the standard kernel density estimators. Maximum smoothed likelihood density estimation is a regularization method. We prove an inequality which implies the stability and convergence of the regularization method for the large-sample asymptotic problem. Under minimal assumptions it also implies the a.s. convergence of the finite-sample density estimate via a uniform version of the strong law of large numbers. Under additional regularity conditions we obtain a.s. convergence rates via a uniform version of the law of the iterated logarithm, albeit under stronger conditions than usual.
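To fix ideas, the following is a minimal discretized sketch of one smoothed EM step of the kind alluded to above; it is an illustration only, not a restatement of the algorithm analyzed in the paper. The grid, the kernel matrix `K`, the smoothing matrix `S`, and the binned data `g` are hypothetical placeholders, and the geometric (log-domain) smoothing shown is one common way to smooth nonlinearly, assumed here merely to resemble the nonlinearly smoothed EMS iteration.

```python
import numpy as np

def ems_step(f, K, g, S, nonlinear=True, eps=1e-12):
    """One smoothed EM step for binned indirect density estimation (illustrative sketch).

    f : current density estimate on a grid of x-values (length p, nonnegative, sums to 1)
    K : discretized integral operator (m x p), entry (j, i) approximating k(y_j | x_i)
    g : relative frequencies of the observed Y-bins (length m, sums to 1)
    S : smoothing matrix (p x p), e.g. a discretized kernel smoother
    """
    # Classical EM step for indirect observations: multiply the current
    # estimate by the back-projected ratio of data to fitted values.
    ratio = g / np.maximum(K @ f, eps)
    em = f * (K.T @ ratio)

    if nonlinear:
        # Nonlinear (geometric) smoothing: smooth log(em) rather than em itself.
        smoothed = np.exp(S @ np.log(np.maximum(em, eps)))
    else:
        # Linear smoothing, as in the original EMS proposal.
        smoothed = S @ em

    return smoothed / smoothed.sum()  # renormalize to a probability vector
```

Iterating `ems_step` from, say, a uniform starting vector gives a discrete EMS-type estimate. When `K` is the identity and `S` is a discretized kernel smoother, the linear variant reduces to a kernel-smoothed histogram, a discrete analogue of the remark above that the method recovers standard kernel density estimators in the direct case.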