Abstract
As was pointed out in the Introduction, many important sparse recovery methods are based on empirical risk minimization with a convex loss and a convex complexity penalty. Some interesting algorithms, for instance, the Dantzig selector by Candes and Tao [44], can be formulated as linear programs. In this chapter, we develop error bounds for such algorithms that require certain geometric assumptions on the dictionary. These assumptions are expressed in terms of restricted isometry constants and other related characteristics that depend both on the dictionary and on the design distribution. Based on these geometric characteristics, we describe conditions for exact sparse recovery in the noiseless case as well as sparsity oracle inequalities for the Dantzig selector in regression problems with random noise. These results rely on comparison inequalities and exponential bounds for empirical and Rademacher processes.
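To illustrate the linear-programming formulation mentioned above, the following sketch solves the Dantzig selector, minimize ||b||_1 subject to ||X'(y - Xb)||_inf <= lambda, by splitting b into positive and negative parts. This is an illustrative implementation using `scipy.optimize.linprog`, not the chapter's own code; the problem sizes, variable names, and the choice of `lambda` are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical small noiseless instance for illustration.
rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = [3.0, -2.0]          # 2-sparse target
y = X @ beta_true                     # noiseless observations
lam = 0.1                             # constraint level (assumption)

# Dantzig selector as an LP: write beta = u - v with u, v >= 0,
# minimize sum(u) + sum(v) subject to |X'(y - X(u - v))| <= lam.
G = X.T @ X
b = X.T @ y
c = np.ones(2 * p)                    # objective: the l1 norm of beta
A_ub = np.block([[-G, G], [G, -G]])   # encodes both sides of the sup-norm bound
b_ub = np.concatenate([lam - b, lam + b])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
beta_hat = res.x[:p] - res.x[p:]
```

In the noiseless case the true coefficient vector is feasible, so the LP solution satisfies the correlation constraint exactly and, under suitable restricted isometry conditions of the kind the chapter develops, recovers the sparse signal.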