Document-level relation extraction aims to uncover relations between entities by harnessing the intricate information spread throughout a document. Previous research constructed discrete syntactic matrices to capture syntactic relationships within documents. However, these methods are highly sensitive to dependency parsing errors and leave much of the latent syntactic information untapped. Moreover, prior work has mainly focused on modeling two-hop reasoning between entity pairs, which limits its applicability in scenarios requiring multi-hop reasoning. To tackle these challenges, a syntax-enhanced multi-hop reasoning network (SEMHRN) is proposed. Specifically, the approach builds the syntactic graph from a dependency probability matrix, which carries richer grammatical information than a sparse syntactic parsing matrix; this mitigates syntactic parsing errors and enhances the model's robustness. To fully leverage dependency information, dependency-type-aware attention is introduced to refine edge weights according to the types of the connecting edges, and an auxiliary part-of-speech prediction task is added to regularize word embeddings. Because unrelated entity pairs can disrupt the model's focus and reduce its efficiency, related entity pairs are first extracted, and a multi-hop reasoning graph attention network is then employed to capture the multi-hop dependencies among them. Experimental results on three public document-level relation extraction datasets show that SEMHRN achieves a competitive F1 score compared with current state-of-the-art methods.
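To make the core idea concrete, the following is a minimal NumPy sketch of one layer of dependency-type-aware graph attention over a soft dependency probability matrix, as described above. It is an illustrative approximation under stated assumptions, not the paper's exact formulation: the names `type_aware_attention`, `type_emb`, `W_q`, and `W_k` are hypothetical, the type embedding is reduced to a scalar bias per dependency type, and multi-hop reasoning is approximated by stacking the layer.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def type_aware_attention(H, P, T, type_emb, W_q, W_k):
    """One layer of dependency-type-aware graph attention (illustrative sketch).

    H        : (n, d) word representations
    P        : (n, n) dependency probability matrix (soft parse, rows sum to 1)
    T        : (n, n) integer dependency-type id for each potential edge
    type_emb : (num_types,) scalar bias per dependency type (assumed form)
    W_q, W_k : (d, d) projection matrices (hypothetical names)
    """
    d = H.shape[1]
    # Standard scaled dot-product attention scores between words.
    scores = (H @ W_q) @ (H @ W_k).T / np.sqrt(d)
    # Refine each edge's weight by its dependency type, as in type-aware attention.
    scores = scores + type_emb[T]
    # Bias attention toward likely dependency edges via the soft parse matrix.
    scores = scores + np.log(P + 1e-9)
    A = softmax(scores, axis=-1)
    return A @ H  # aggregated, syntax-aware word representations

def multi_hop(H, P, T, type_emb, layer_weights):
    """Stack attention layers so information propagates over multiple hops."""
    for W_q, W_k in layer_weights:
        H = type_aware_attention(H, P, T, type_emb, W_q, W_k)
    return H
```

Each stacked layer lets a word attend to its soft syntactic neighbors, so k layers propagate information along paths of up to k dependency edges, which is the intuition behind the multi-hop reasoning component.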