Abstract
Consider the matrix problem $Ax = y + \varepsilon = \tilde{y}$ in the case where $A$ is known precisely, the problem is ill conditioned, and $\varepsilon$ is a random noise vector. Compute regularized "ridge" estimates $\tilde{x}_\lambda = (A^*A + \lambda I)^{-1} A^*\tilde{y}$, where $*$ denotes matrix transpose. Of great concern is the determination of the value of $\lambda$ for which $\tilde{x}_\lambda$ "best" approximates $x_0 = A^+ y$. Let $Q = \|\tilde{x}_\lambda - x_0\|^2$, and define $\lambda_0$ to be the value of $\lambda$ for which $Q$ is a minimum. We look for $\lambda_0$ among the solutions of $dQ/d\lambda = 0$. Although $Q$ is not computable (since $\varepsilon$ is unknown), this approach lets us study the behavior of $\lambda_0$ as a function of $y$ and $\varepsilon$. Theorems involving "noise-to-signal ratios" determine when $\lambda_0$ exists and distinguish the cases $\lambda_0 > 0$ and $\lambda_0 = \infty$. Estimates for $\lambda_0$ and the minimum square error $Q_0 = Q(\lambda_0)$ are derived.
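The following is a minimal numerical sketch, not taken from the paper, of the quantities the abstract defines: a synthetic ill-conditioned $A$ (constructed here with geometrically decaying singular values, an assumption for illustration), the ridge estimate $\tilde{x}_\lambda$, and the error $Q(\lambda) = \|\tilde{x}_\lambda - x_0\|^2$. Because this is a simulation, $x_0$ and $\varepsilon$ are known, so $Q$ can be evaluated and $\lambda_0$ located by a simple grid scan; in the setting of the paper $Q$ is not computable, which is precisely why the theoretical estimates of $\lambda_0$ and $Q_0$ matter.

```python
import numpy as np

# Illustrative sketch (not the paper's method): build a small ill-conditioned
# system A x = y, perturb it with random noise eps, and compute ridge estimates
#   x_lambda = (A^T A + lambda I)^{-1} A^T (y + eps),
# together with the squared error Q(lambda) = ||x_lambda - x0||^2.

rng = np.random.default_rng(0)
n = 20

# Hypothetical ill-conditioned matrix via its SVD: singular values 1 ... 1e-6.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -6, n)
A = U @ np.diag(s) @ V.T

x0 = rng.standard_normal(n)            # "true" solution x0 = A^+ y
y = A @ x0                             # exact right-hand side
eps = 1e-3 * rng.standard_normal(n)    # random noise vector
y_tilde = y + eps                      # observed data

def ridge_estimate(lam):
    """Ridge estimate x_lambda = (A^T A + lam I)^{-1} A^T y_tilde."""
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y_tilde)

def Q(lam):
    """Squared error Q(lambda) = ||x_lambda - x0||^2 (requires knowing x0)."""
    return np.sum((ridge_estimate(lam) - x0) ** 2)

# Scan lambda on a log grid and locate the minimizer lambda_0 numerically.
lams = np.logspace(-12, 2, 300)
Qs = np.array([Q(lam) for lam in lams])
lam0 = lams[np.argmin(Qs)]
print(f"lambda_0 ~ {lam0:.3e},  Q_0 = Q(lambda_0) ~ {Qs.min():.3e}")
print(f"error of the unregularized solve, Q(lambda ~ 0): {Q(1e-15):.3e}")
```

Running this, the unregularized error is dominated by noise amplified through the small singular values, while $Q(\lambda_0)$ is orders of magnitude smaller, which is the trade-off the abstract's "noise-to-signal ratio" theorems quantify.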