Abstract

In this paper, we consider the multichannel blind deconvolution problem, in which we observe the outputs of $N$ channels $\mathbf {h}_{i}\in \mathbb {R}^n$ ( $i=1,\dots,N$ ) that all convolve with the same unknown input signal $\mathbf {x}\in \mathbb {R}^n$ , and we wish to estimate the input signal and the blur kernels simultaneously. Existing theoretical results show that the original inputs are identifiable under subspace assumptions; however, the subspaces considered in prior work were chosen randomly or generically. Here, we propose a deterministic subspace assumption, which is widely used in practice, and establish several theoretical results. First, we derive a tight sufficient condition for identifiability of the signal and the convolution kernels, which is violated only on a set of Lebesgue measure zero. Then, we present a nonconvex regularization algorithm based on a lifting method, approximating the rank-one constraint by the difference of the nuclear norm and the Frobenius norm. The global minimizer of the proposed nonconvex model is a rank-one matrix under mild conditions on the parameters and the noise level. We also establish a stability result under the assumption that the inputs lie in a compact set. In addition, we describe the computation of our regularization model and show that any limit point of the iterates is a stationary point of the model. Finally, we provide numerical experiments showing that our nonconvex regularization model outperforms convex relaxation models, such as nuclear norm minimization, as well as nonconvex methods such as the alternating minimization method and the spectral method.
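
To make the setting concrete, a schematic form of the lifted model and the proposed surrogate is sketched below; the lifted variable $\mathbf {Z}$, the linear operator $\mathcal {A}$, and the stacked data vector $\mathbf {y}$ are assumed notation for illustration, not the paper's exact definitions. Each observation is $\mathbf {y}_i = \mathbf {h}_i \circledast \mathbf {x} + \mathbf {w}_i$ with $\mathbf {h}_i$ lying in a known deterministic subspace; after lifting, the bilinear unknowns are collected into a single matrix $\mathbf {Z}$ (e.g., the outer product of $\mathbf {x}$ with the stacked kernel coefficients), the measurements become linear in $\mathbf {Z}$, and the regularized problem takes the form

$$\min_{\mathbf {Z}} \; \frac{1}{2}\bigl\|\mathcal {A}(\mathbf {Z}) - \mathbf {y}\bigr\|_2^2 + \lambda\bigl(\|\mathbf {Z}\|_{*} - \|\mathbf {Z}\|_{F}\bigr),$$

where $\|\cdot\|_{*}$ and $\|\cdot\|_{F}$ denote the nuclear and Frobenius norms. Since $\|\mathbf {Z}\|_{*} \ge \|\mathbf {Z}\|_{F}$ with equality exactly when $\mathbf {Z}$ has at most one nonzero singular value, the difference of the two norms vanishes precisely on rank-one (or zero) matrices, which is why it serves as a rank-one surrogate.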
