Abstract
This study examines an adaptive log-density estimation method with an $\ell_1$-type penalty. The proposed estimator is guaranteed to be a valid density in the sense that it is positive and integrates to one. The smoothness of the estimator is controlled in a data-adaptive way via $\ell_1$ penalization. The advantages of the penalized log-density estimator are discussed with an emphasis on wavelet estimators. Theoretical properties of the estimator are studied when the quality of fit is measured by the Kullback–Leibler divergence (relative entropy). A nonasymptotic oracle inequality is obtained assuming a near orthogonality condition on the given dictionary. Based on the oracle inequality, selection consistency and minimax adaptivity are proved under some regularity conditions. The proposed method is implemented with a coordinate descent algorithm. Numerical illustrations based on the periodized Meyer wavelets are performed to demonstrate the finite sample performance of the proposed estimator.
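The abstract's construction can be sketched concretely: writing the log-density as a dictionary expansion minus a log-partition term keeps the estimate positive and integrating to one by design, and an $\ell_1$ penalty on the coefficients is minimized by coordinate-wise proximal (soft-thresholding) steps. The sketch below is illustrative only, not the paper's implementation: it substitutes a simple cosine dictionary on $[0,1]$ for the periodized Meyer wavelets, and all function names, the quadrature grid, and the tuning constants (`lam`, `eta`, `iters`) are assumptions.

```python
# Hedged sketch: l1-penalized log-density estimation on [0,1] via
# coordinate descent. Cosine dictionary stands in for the paper's
# periodized Meyer wavelets; all names and constants are illustrative.
import numpy as np

def dictionary(x, p):
    # Orthonormal cosine basis on [0,1]; the constant function is
    # absorbed by the normalizer, so it is omitted from the dictionary.
    return np.array([np.sqrt(2) * np.cos(np.pi * (j + 1) * x) for j in range(p)])

def log_partition(theta, Phi_grid):
    # b(theta) = log integral of exp(theta . phi(x)) dx, by quadrature
    # on a uniform grid, with max-subtraction for numerical stability.
    s = theta @ Phi_grid
    m = s.max()
    return m + np.log(np.mean(np.exp(s - m)))

def fit(X, p=8, lam=0.05, eta=0.5, iters=200):
    # Minimize b(theta) - theta . mhat + lam * ||theta||_1, where mhat
    # holds the empirical dictionary means (the negative penalized
    # log-likelihood up to constants).
    grid = np.linspace(0.0, 1.0, 2001)
    Phi_grid = dictionary(grid, p)           # shape (p, ngrid)
    mhat = dictionary(X, p).mean(axis=1)     # empirical means of phi_j
    theta = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            # Model expectation E_theta[phi_j] under the current density.
            s = theta @ Phi_grid
            w = np.exp(s - s.max())
            w /= w.sum()
            grad_j = (Phi_grid[j] * w).sum() - mhat[j]
            # Proximal coordinate step: gradient move, then soft threshold.
            z = theta[j] - eta * grad_j
            theta[j] = np.sign(z) * max(abs(z) - eta * lam, 0.0)
    return theta, grid, Phi_grid

rng = np.random.default_rng(0)
X = rng.beta(2, 5, size=500)                 # toy data supported on [0,1]
theta, grid, Phi_grid = fit(X)
logf = theta @ Phi_grid - log_partition(theta, Phi_grid)
f = np.exp(logf)
print(np.trapz(f, grid))                     # close to 1: valid density by construction
```

Because the estimate is `exp(dictionary expansion - log partition)`, positivity and unit mass hold for any coefficient vector; the soft-thresholding step is what makes the smoothness selection data-adaptive, zeroing out small coefficients.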