A new iterative algorithm is proposed for the solution of minimization problems in infinite-dimensional Hilbert spaces that involve sparsity constraints in the form of $\ell^{p}$-penalties. In contrast to the well-known algorithm considered by Daubechies, Defrise, and De Mol, it uses hard shrinkage instead of soft shrinkage. It is shown that the hard shrinkage algorithm is a special case of the generalized conditional gradient method. Convergence properties of the generalized conditional gradient method with a quadratic discrepancy term are analyzed. This leads to strong convergence of the iterates, with convergence rates $\mathcal{O}(n^{-1/2})$ for $p=1$ and $\mathcal{O}(\lambda^n)$ for $1 < p \leq 2$. Numerical experiments on image deblurring, backward heat conduction, and inverse integration are presented.
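The kind of iteration discussed above can be sketched as follows: a gradient (Landweber) step on the quadratic discrepancy $\|Ax-y\|^2$, followed by hard shrinkage of the coefficients. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names, the threshold `tau`, and the assumption $\|A\| \leq 1$ (so the plain gradient step is stable) are choices made here for the example.

```python
import numpy as np

def hard_shrink(v, tau):
    # Hard shrinkage: zero out coefficients with magnitude <= tau and
    # keep the rest unchanged (soft shrinkage would additionally pull
    # the surviving magnitudes toward zero by tau).
    return np.where(np.abs(v) > tau, v, 0.0)

def iterate_hard_shrinkage(A, y, tau, n_iter=200):
    # One Landweber step followed by hard shrinkage per iteration;
    # assumes the operator has been scaled so that ||A|| <= 1.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_shrink(x + A.T @ (y - A @ x), tau)
    return x

# Small demo on synthetic data: a sparse vector observed through a
# random matrix scaled to unit operator norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
A /= np.linalg.norm(A, 2)              # enforce ||A|| <= 1
x_true = np.zeros(20)
x_true[[3, 7]] = [2.0, -1.5]
x_rec = iterate_hard_shrinkage(A, A @ x_true, tau=0.1)
print(np.nonzero(x_rec)[0])            # support of the reconstruction
```

In finite dimensions the iteration is cheap: each step costs two matrix-vector products plus an elementwise threshold, which is why shrinkage-type methods scale well to the deblurring and inverse-integration problems mentioned in the experiments.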