Abstract

Neural networks based on the unfolding of iterative solvers, such as LISTA (Learned Iterative Soft Shrinkage Algorithm), are widely used due to their accelerated performance. These networks are trained with a fixed dictionary and are therefore inapplicable in varying-model scenarios, as opposed to their flexible non-learned counterparts. We introduce Ada-LISTA, an adaptive learned solver that receives as input both the signal and its corresponding dictionary, and learns a universal architecture to serve them all. This scheme solves sparse coding at a linear rate under varying models, including permutations and perturbations of the dictionary. We provide an extensive theoretical and numerical study demonstrating the adaptation capabilities of our approach and its application to the task of natural image inpainting.
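
To make the unfolding idea concrete, below is a minimal, illustrative sketch (in NumPy) of an unrolled ISTA-style solver that, in the spirit of Ada-LISTA, takes both the signal y and its dictionary D as inputs. The learned matrices W1, W2 and the per-iteration parameters thetas, gammas are hypothetical placeholders for the weights a trained network would hold; this is not the paper's exact parameterization.

import numpy as np

def soft_threshold(x, theta):
    # Soft shrinkage: the proximal operator of the L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def unrolled_solver(y, D, W1, W2, thetas, gammas):
    # Run K unfolded iterations. The dictionary D is an input to the
    # forward pass; the (hypothetical) learned matrices W1, W2 modulate D
    # rather than replacing it, so the same weights serve varying models.
    x = np.zeros(D.shape[1])
    for theta, gamma in zip(thetas, gammas):
        residual = (W1 @ D) @ x - y                      # model-aware data residual
        x = soft_threshold(x - gamma * (W2 @ D).T @ residual, theta)
    return x

# Example usage with a random model (illustrative only):
m, n, K = 20, 50, 16
rng = np.random.default_rng(0)
D = rng.standard_normal((m, n)) / np.sqrt(m)
y = D @ (rng.standard_normal(n) * (rng.random(n) < 0.1))
x_hat = unrolled_solver(y, D, np.eye(m), np.eye(m),
                        thetas=[0.05] * K, gammas=[0.5] * K)

With W1 = W2 = I and suitable fixed step sizes and thresholds, this reduces to plain ISTA applied to the given dictionary; learning shared weights and per-iteration parameters from data is what yields the accelerated, dictionary-adaptive behavior summarized in the abstract.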
