Abstract

Lyric rewriting involves taking the original lyrics of a song and creatively rephrasing them while preserving their core meaning and emotional essence. When applied to this task, sequence-to-sequence methods often suffer from the lack of an annotated corpus and from difficulty in understanding lyrics. Inspired by grammatical error correction (GEC), a language rewriting technique, and neural machine translation (NMT), a sequence-to-sequence generation technique, we propose novel self-supervised learning methods that effectively address the shortage of lyric rewriting corpora. In addition, we propose a new pre-trained DAE Transformer model that fuses prior knowledge from the data to enhance lyric rewriting ability. Our reference-as-context model (RaC-Large), built on these two methods, achieves the best results against baselines that include large language models, fully verifying the effectiveness of the new approach. We also validate our approach on GEC and NMT tasks, further demonstrating its potential across a broad range of sequence-to-sequence tasks.
