This study introduces a novel Cross-Language Information Retrieval (CLIR) method that employs multi-task learning with soft parameter sharing to strengthen the cross-lingual feature extraction of neural retrieval models. The approach jointly trains an interaction-based neural retrieval model and a semantics-based text classification model, which exchange hidden vectors to enrich their feature representations. Experiments on four language pairs (English-Chinese, English-Arabic, English-French, and English-German) show consistent performance gains. The proposed method achieved the highest Mean Average Precision (MAP) scores: 0.419 for EN-ZH, 0.403 for EN-AR, 0.427 for EN-FR, and 0.441 for EN-DE, surpassing baselines including BM25, BPNRM, KNRM, KNRM-Trans, and KNRM-Embed. These results underscore the potential of multi-task learning for CLIR, improving retrieval performance through shared semantic information and knowledge transfer.
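To make the soft parameter sharing idea concrete, the following is a minimal, hedged sketch in plain Python: two task-specific models (retrieval and classification) keep their own parameters, and a regularization term pulls corresponding parameters toward each other rather than forcing them to be identical as in hard sharing. All function names, parameter values, and the weight `lam` are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of soft parameter sharing in multi-task learning.
# Names and numbers are hypothetical; the paper's actual models are neural.

def soft_sharing_penalty(params_a, params_b):
    """Squared L2 distance between corresponding parameters of two task
    models. Soft sharing: each task keeps its own weights, but this
    penalty pulls them toward each other (hard sharing would tie them)."""
    return sum((a - b) ** 2 for a, b in zip(params_a, params_b))

def multitask_loss(retrieval_loss, classification_loss,
                   params_ret, params_cls, lam=0.1):
    # Joint objective: both task losses plus the soft-sharing regularizer,
    # weighted by a tunable coefficient lam.
    return (retrieval_loss + classification_loss
            + lam * soft_sharing_penalty(params_ret, params_cls))

# Toy example: small parameter vectors standing in for two models' layers.
w_retrieval = [0.5, -0.2, 0.8]
w_classification = [0.4, -0.1, 0.9]
loss = multitask_loss(0.7, 0.3, w_retrieval, w_classification)
```

Minimizing this joint loss lets gradients from the classification task shape the retrieval model's parameters (and vice versa), which is one way the knowledge transfer described above can be realized.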