This paper considers the blind deconvolution of multiple modulated signals/filters and an arbitrary filter/signal. Multiple inputs $\boldsymbol{s}_1, \boldsymbol{s}_2, \ldots, \boldsymbol{s}_N =: [\boldsymbol{s}_n]$ are modulated (pointwise multiplied) with random sign sequences $\boldsymbol{r}_1, \boldsymbol{r}_2, \ldots, \boldsymbol{r}_N =: [\boldsymbol{r}_n]$, respectively, and the resultant inputs $(\boldsymbol{s}_n \odot \boldsymbol{r}_n) \in \mathbb{C}^Q$, $n \in [N]$, are convolved against an arbitrary input $\boldsymbol{h} \in \mathbb{C}^M$ to yield the measurements $\boldsymbol{y}_n = (\boldsymbol{s}_n \odot \boldsymbol{r}_n) \circledast \boldsymbol{h}$, $n \in [N] := \{1,2,\ldots,N\}$, where $\odot$ and $\circledast$ denote pointwise multiplication and circular convolution, respectively. Given $[\boldsymbol{y}_n]$, we want to recover the unknowns $[\boldsymbol{s}_n]$ and $\boldsymbol{h}$. We make the structural assumption that the unknowns $[\boldsymbol{s}_n]$ are members of a known $K$-dimensional (not necessarily random) subspace, and prove that the unknowns can be recovered from sufficiently many observations using a regularized gradient descent algorithm whenever the modulated inputs $\boldsymbol{s}_n \odot \boldsymbol{r}_n$ are long enough, i.e., $Q \gtrsim KN+M$ (to within logarithmic factors and signal dispersion/coherence parameters). Under the bilinear model, this is the first result on multichannel ($N \geq 1$) blind deconvolution with provable recovery guarantees under near-optimal (in the $N=1$ case) sample complexity estimates and comparatively lenient structural assumptions on the convolved inputs. A notable consequence of this result is that modulation of a bandlimited signal protects it against an unknown convolutive distortion. We discuss the applications of this result in passive imaging, wireless communication in unknown environments, and image deblurring.
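As a concrete illustration, the forward measurement model $\boldsymbol{y}_n = (\boldsymbol{s}_n \odot \boldsymbol{r}_n) \circledast \boldsymbol{h}$ can be simulated in a few lines of NumPy. This is only a sketch of the measurement process, not the recovery algorithm; the dimensions, the Gaussian subspace basis, and the zero-padding of $\boldsymbol{h}$ to length $Q$ (so that the circular convolution is well defined) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
Q, K, M, N = 64, 8, 16, 4  # hypothetical sizes satisfying Q >~ K*N + M

# Known K-dimensional subspace: each s_n = B @ x_n for some coefficient vector x_n.
B = rng.standard_normal((Q, K))

# Unknown filter h of length M, zero-padded to length Q for circular convolution.
h = np.zeros(Q)
h[:M] = rng.standard_normal(M)
H = np.fft.fft(h)

ys = []
for n in range(N):
    s = B @ rng.standard_normal(K)         # unknown input s_n in the known subspace
    r = rng.choice([-1.0, 1.0], size=Q)    # known random sign modulation r_n
    # Circular convolution (s_n . r_n) * h, computed via the FFT.
    y = np.fft.ifft(np.fft.fft(s * r) * H).real
    ys.append(y)
```

Given the measurements `ys` and knowledge of `B` and the sign sequences `r`, the recovery problem is to estimate the coefficient vectors of each `s` together with `h`.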
A thorough numerical investigation of the theoretical results is also presented using phase transitions, image deblurring experiments, and noise stability plots.