Abstract
Nonnegative Least Squares (NNLS) is a general form of many important problems. We consider a special case of NNLS in which the input is itself nonnegative, known in the literature as Totally Nonnegative Least Squares (TNNLS). We show a reduction of TNNLS to a single-class Support Vector Machine (SVM), thereby relating the sparsity of a TNNLS solution to the sparsity of support vectors in an SVM. This allows us to apply any SVM solver to the TNNLS problem. We obtain an order-of-magnitude improvement in running time in two steps: first, we use a fast approximate SVM solver to derive a smaller problem with the same solution as the original; second, we apply an exact NNLS solver to this reduced problem. We present experimental evidence that this approach improves on the performance of state-of-the-art NNLS solvers, applying it both to randomly generated problems and to real datasets for calculating radiation-therapy dosages for cancer patients.
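The two-stage pipeline described above can be sketched in miniature. This is an illustrative toy only: the approximate SVM stage is replaced here by a crude support-pruning heuristic (a short run of projected gradient descent followed by thresholding), and the exact NNLS stage by a longer run of the same solver on the reduced problem; the actual method reduces TNNLS to a single-class SVM, which this sketch does not implement.

```python
import numpy as np

def nnls_pg(A, b, iters=5000):
    """Minimal projected-gradient solver for min ||Ax - b||^2 s.t. x >= 0.
    (Illustrative stand-in; a real pipeline would use an exact NNLS solver.)"""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = np.maximum(0.0, x - grad / L)  # gradient step, then project onto x >= 0
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 60))                   # totally nonnegative input: A >= 0
b = rng.random(40)                         # and b >= 0

# Stage 1 (stand-in for the approximate SVM step): solve cheaply and keep
# only the support -- columns with clearly nonzero coefficients.
x_approx = nnls_pg(A, b, iters=500)
support = np.where(x_approx > 1e-6)[0]

# Stage 2: solve the reduced problem on the support columns only.
x_small = nnls_pg(A[:, support], b)

# Embed the reduced solution back into the full coordinate system.
x_full = np.zeros(A.shape[1])
x_full[support] = x_small
print(len(support), A.shape[1])            # the support is much smaller than n
```

The speedup in the paper comes from the same structural fact this toy relies on: TNNLS solutions are sparse, so once the support is identified cheaply, the exact solver only has to handle a much smaller problem.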