Abstract

Context information is significant for semantic extraction and recovery of messages in semantic communication. However, existing semantic communication systems do not fully exploit context information, since relationships between sentences are often ignored. In this paper, we propose an extended context-based semantic communication (ECSC) system for text transmission, in which context information within and between sentences is explored for semantic representation and recovery. At the encoder, self-attention and segment-level relative attention are used to extract context information within and between sentences, respectively. In addition, a gate mechanism is adopted at the encoder to incorporate the context information from different ranges. At the decoder, Transformer-XL is introduced to obtain more semantic information from the historical communication process for semantic recovery. Simulation results show the effectiveness of our proposed model in improving the semantic accuracy between transmitted and recovered messages under various channel conditions.
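The gate mechanism mentioned above can be illustrated with a minimal sketch: an element-wise sigmoid gate blends the intra-sentence context (from self-attention) with the inter-sentence context (from segment-level relative attention). This is our own toy illustration, not the paper's ECSC implementation; the names (`gated_fusion`, `d_model`) and the random weights are hypothetical stand-ins.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_intra, h_inter, W_g, b_g):
    """Blend two context representations with an element-wise gate.

    g = sigmoid([h_intra; h_inter] @ W_g + b_g)
    h = g * h_intra + (1 - g) * h_inter
    """
    g = sigmoid(np.concatenate([h_intra, h_inter], axis=-1) @ W_g + b_g)
    return g * h_intra + (1.0 - g) * h_inter

rng = np.random.default_rng(0)
d_model = 8                                    # hypothetical hidden size
h_intra = rng.standard_normal((4, d_model))    # 4 tokens, intra-sentence context
h_inter = rng.standard_normal((4, d_model))    # same tokens, inter-sentence context
W_g = rng.standard_normal((2 * d_model, d_model)) * 0.1
b_g = np.zeros(d_model)

h_fused = gated_fusion(h_intra, h_inter, W_g, b_g)
print(h_fused.shape)  # (4, 8)
```

Because the gate values lie in (0, 1), each fused element is a convex combination of the two context representations, so the encoder can smoothly weight short-range versus long-range context per dimension.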
