The fully-connected tensor network (FCTN) decomposition has recently drawn considerable attention in tensor completion because it characterizes the correlation between every pair of modes. However, the FCTN model involves multiple ranks, and existing methods often ignore the difficulty of rank selection, especially when the model is rank-sensitive. To overcome this drawback, this paper proposes a low-rank sparse FCTN model for tensor completion. Specifically, we theoretically show that, for a tensor with FCTN structure, each subtensor can be represented as a weighted sum of basic tensors, where the coefficients are retained in the FCTN factors. This implies that sparsity of the factors can improve the robustness of the FCTN model to the selection of larger ranks. Moreover, based on the relation between the target tensor and its FCTN factors, a low-rank constraint is imposed to further enhance rank robustness. Finally, we solve the proposed model with the alternating direction method of multipliers (ADMM). Experimental results show that the low-rank sparse constraint effectively improves the rank robustness of the FCTN model and that our method achieves excellent results compared with other state-of-the-art completion methods.
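To make the "fully connected" structure concrete, the following is a minimal sketch (not the paper's implementation) of how a third-order tensor is assembled from FCTN factors: every pair of factors shares one rank index, so any two modes are directly connected. The factor names `G1`–`G3`, the tensor dimensions, and the rank values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
I1, I2, I3 = 4, 5, 6      # tensor dimensions (illustrative)
R12, R13, R23 = 2, 3, 2   # one FCTN rank per pair of modes (illustrative)

# Factor k carries its mode dimension I_k plus one rank index
# shared with each of the other factors.
G1 = rng.standard_normal((I1, R12, R13))
G2 = rng.standard_normal((R12, I2, R23))
G3 = rng.standard_normal((R13, R23, I3))

# Contract every shared rank index:
# X[i,j,k] = sum_{a,b,c} G1[i,a,b] * G2[a,j,c] * G3[b,c,k]
X = np.einsum('iab,ajc,bck->ijk', G1, G2, G3)
print(X.shape)  # (4, 5, 6)
```

For a third-order tensor the FCTN coincides with a triangular tensor network; for higher orders each factor simply gains one additional rank index per extra mode, which is why the number of ranks to select grows quadratically with the order.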