Popularity bias is commonly observed in recommendation results, and directly fitting biased data can significantly degrade recommendation quality for long-tail items. To mitigate popularity bias, we propose a novel recommendation method called CrossGCL. Unlike existing unbiased recommendation methods based on Bayesian Personalized Ranking (BPR), CrossGCL adopts Cross Pairwise Ranking (CPR) as its primary task, aiming to predict user preferences accurately while promoting niche items. Furthermore, to better exploit unsupervised data for debiasing, CrossGCL introduces a layer-wise graph contrastive learning approach that contrasts representations from odd- and even-numbered layers and incorporates noise augmentation to construct pretext tasks. CrossGCL then jointly trains the pretext task with the primary task for Top-K recommendation. Experimental results on MovieLens-10M, Netflix, and Alibaba-iFashion demonstrate that, compared with baselines, CrossGCL effectively reduces the proportion of popular items in recommendation lists while improving recommendation accuracy. The code is available at https://github.com/RYoto116/CrossGCL.