In this paper, we consider optimization problems with a 0/1-loss function and sparsity constraints (0/1-LSCO) that involve two blocks of variables. We first define a [Formula: see text]-stationary point of 0/1-LSCO and use it to derive first-order necessary and sufficient optimality conditions. Building on these results, we then develop a gradient descent Newton pursuit algorithm (GDNP) and establish its global convergence and locally quadratic convergence rate under standard assumptions. Finally, numerical experiments on 1-bit compressed sensing demonstrate its superior performance in terms of recovery accuracy.
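To make the "gradient descent then Newton pursuit" idea concrete, the following is a minimal illustrative sketch, not the authors' exact GDNP: it alternates a gradient step on a smooth least-squares surrogate f(x) = 0.5‖Ax − b‖², a hard-thresholding projection onto the sparsity constraint ‖x‖₀ ≤ s, and a Newton-type refinement restricted to the selected support (for least squares, an exact solve). The names `A`, `b`, `tau`, `s`, and the surrogate itself are assumptions made for illustration only.

```python
import numpy as np

def gdnp_sketch(A, b, s, tau=0.01, iters=50):
    """Illustrative gradient-descent + Newton-pursuit loop (NOT the paper's
    exact GDNP) for min f(x) s.t. ||x||_0 <= s, with the assumed smooth
    surrogate f(x) = 0.5 * ||A x - b||^2."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth surrogate
        u = x - tau * grad                    # gradient descent step
        support = np.argsort(-np.abs(u))[:s]  # hard-threshold: keep s largest entries
        # Newton-type refinement on the active support; for least squares this
        # reduces to an exact (lightly regularized) normal-equations solve
        As = A[:, support]
        x = np.zeros(n)
        x[support] = np.linalg.solve(As.T @ As + 1e-10 * np.eye(s), As.T @ b)
    return x
```

On a noiseless random instance this loop typically locks onto a support within a few iterations, after which the restricted solve makes it converge rapidly, mirroring the local quadratic behavior described in the abstract.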