In natural language processing, document-level relation extraction is a challenging task that aims to predict the relations among entities by capturing contextual interactions across an unstructured document. Existing graph- and transformer-based models can capture long-range relational facts across sentences, but they still fail to fully exploit the semantic information carried by multiple interacting sentences, so sentences that are influential for related entity pairs may be excluded. To address this problem, a novel Semantic-guided Attention and Adaptively Gated (SAAG) model is developed for document-level relation extraction. First, a semantic-guided attention module guides sentence representation by assigning different attention scores to individual words. A multi-head attention mechanism then captures attention in different representation subspaces to generate a document-level context representation. Finally, the SAAG model exploits semantic information through a gating mechanism that dynamically balances local and global contexts. Experimental results demonstrate that the SAAG model outperforms previous models on two public datasets.
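To make the three components concrete, the sketch below gives a minimal PyTorch rendering of the pipeline the abstract describes: a word-level attention module guided by a semantic query, multi-head attention over sentence vectors to form a document context, and a sigmoid gate that adaptively mixes local and global contexts. All module names (SemanticGuidedAttention, AdaptiveGate), the bilinear scoring function, and the mean-pooled document vector are assumptions for illustration; the paper's actual formulation may differ.

```python
import torch
import torch.nn as nn

class SemanticGuidedAttention(nn.Module):
    """Word-level attention guided by a semantic query vector.

    Hypothetical sketch: scores each word against a query (e.g. an
    entity-pair representation) and pools the sentence accordingly.
    """
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Bilinear(hidden_dim, hidden_dim, 1)

    def forward(self, words: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # words: (batch, seq_len, hidden); query: (batch, hidden)
        q = query.unsqueeze(1).expand_as(words)            # broadcast query to each word
        attn = torch.softmax(self.score(words, q), dim=1)  # (batch, seq_len, 1) weights
        return (attn * words).sum(dim=1)                   # attention-weighted sentence vector

class AdaptiveGate(nn.Module):
    """Gate that dynamically mixes local and global context vectors."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, local_ctx: torch.Tensor, global_ctx: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([local_ctx, global_ctx], dim=-1)))
        return g * local_ctx + (1.0 - g) * global_ctx      # convex combination of contexts

# Usage with random tensors (batch of 2 documents, 10 words, 64-dim states).
hidden = 64
words = torch.randn(2, 10, hidden)
query = torch.randn(2, hidden)
local_ctx = SemanticGuidedAttention(hidden)(words, query)

# Document context via multi-head self-attention over 5 sentence vectors.
mha = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
sents = torch.randn(2, 5, hidden)
doc_ctx, _ = mha(sents, sents, sents)
global_ctx = doc_ctx.mean(dim=1)                           # pooled document representation

fused = AdaptiveGate(hidden)(local_ctx, global_ctx)
print(fused.shape)  # torch.Size([2, 64])
```

The gate's convex combination lets the model lean on the local sentence context when it is decisive and fall back on the document-level context otherwise, which matches the abstract's claim of dynamically distinguishing local from global information.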