Abstract

Given training data, convolutional dictionary learning (CDL) seeks a translation-invariant sparse representation, characterized by a set of convolutional kernels. However, even a small training set of moderately sized samples can render the optimization both computationally challenging and memory intensive. Under a biconvex optimization strategy for CDL, we propose to diagonally precondition the system matrices in the filter-learning sub-problem, which can be solved by the alternating direction method of multipliers (ADMM). This preconditioning replaces the matrix inversion (O(n³)) and matrix multiplication (O(n²)) steps of ADMM with an element-wise operation (O(n)), where n denotes the size of the system matrix, significantly reducing both the computational complexity and the memory requirement. Numerical experiments validate the performance advantage of the proposed method over state-of-the-art alternatives. Code is available at https://github.com/baopingli/Efficient-Convolutional-Dictionary-Learning-using-PADMM.
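To make the complexity reduction concrete, here is a minimal, hypothetical sketch (not the authors' released code) of how a diagonally preconditioned ADMM-style update can replace a per-frequency matrix solve with element-wise division. The variable names (A, b, rho), the array shapes, and the use of a diagonal approximation to the system matrix are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Illustrative sketch of the complexity trade-off described in the abstract.
# For each DFT frequency k, plain ADMM would solve an M x M linear system
# (A_k^H A_k + rho I) x_k = b_k, costing O(M^3) per frequency.
# A diagonal preconditioning / diagonal approximation turns this into an
# element-wise division, costing O(M) per frequency.

rng = np.random.default_rng(0)
N, M = 64, 8      # N frequencies, M filters (illustrative sizes)
rho = 1.0         # ADMM penalty parameter

# Per-frequency coefficient rows and right-hand sides (synthetic data).
A = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))
b = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))

# Exact per-frequency solve: builds and inverts an M x M matrix each time.
x_exact = np.empty_like(b)
for k in range(N):
    Ak = A[k][None, :]                        # 1 x M row at frequency k
    H = Ak.conj().T @ Ak + rho * np.eye(M)    # M x M system matrix
    x_exact[k] = np.linalg.solve(H, b[k])     # O(M^3) solve

# Diagonally preconditioned update: keep only diag(A_k^H A_k) + rho,
# so the "inversion" is a single element-wise division.
d = np.abs(A) ** 2 + rho                      # diagonal entries, shape (N, M)
x_precond = b / d                             # O(M) element-wise update
```

The point of the sketch is the cost structure, not numerical equivalence: the exact loop materializes and factors an M x M matrix N times, while the preconditioned update touches each entry once, which also removes the need to store the N dense system matrices.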
