Skin cancer images often suffer from hair occlusion, which greatly reduces the accuracy of diagnosis and classification. Existing dermoscopic hair removal methods use a segmentation network to locate hairs and then an inpainting network to restore the occluded regions. However, because hairs are thin, faint, and similar in color to the surrounding skin, it is difficult to segment them and to capture their overall structure. Moreover, in the restoration task the only available images are those occluded by hair; there is no ground truth (supervised data) of the same scene without occlusion. In addition, the texture and structure information exploited by existing inpainting methods is often insufficient, leading to poor restoration of skin cancer images. To address these challenges, we propose the intersection-union dual-stream cross-attention Lova-SwinUnet (IUDC-LS). First, we propose the Lova-SwinUnet module, which embeds the Lovász loss function into Swin-Unet, enabling the network to better capture features at multiple scales and thereby produce more accurate hair mask segmentation. Second, we design the intersection-union (IU) module, which combines the segmented masks pairwise by intersection or union and overlays the results on hair-free skin cancer images to generate labeled training data, turning the unsupervised image restoration task into a supervised one. Finally, we propose the dual-stream cross-attention (DC) module, which lets texture and structure information interact and applies cross-attention so that the network attends to whichever of the two is more informative during fusion, thereby improving restoration quality.
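The IU module's data-synthesis step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the use of boolean NumPy masks, and the choice to paint simulated hair pixels black are all assumptions for the sake of the example.

```python
import numpy as np

def synthesize_training_pairs(masks, clean_image):
    """Illustrative sketch of the IU idea: combine segmented hair masks
    pairwise by intersection or union, then overlay each combined mask on a
    hair-free image to create (corrupted, ground-truth) training pairs.
    All names here are hypothetical, not taken from the paper."""
    pairs = []
    for i in range(len(masks)):
        for j in range(i + 1, len(masks)):
            for combined in (masks[i] & masks[j],   # pairwise intersection
                             masks[i] | masks[j]):  # pairwise union
                corrupted = clean_image.copy()
                # Assumption: simulated hair pixels are painted black; the
                # ground truth is the untouched hair-free image.
                corrupted[combined] = 0
                pairs.append((corrupted, clean_image))
    return pairs
```

Because the corrupted input is generated from a known hair-free image, every synthetic sample comes with an exact ground truth, which is what makes supervised training of the inpainting network possible.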
Experimental results show that the proposed method improves PSNR by 5.4875 dB and SSIM by 0.0401 over common existing methods, demonstrating its effectiveness as a tool for skin cancer detection.