Anomaly detection in multivariate time series using unsupervised methods is a formidable challenge. Existing strategies focus on capturing intrinsic temporal patterns and producing a better representation of the input series. The detection model then derives a criterion, typically the distance between the representations and the original data, to distinguish anomalies. Anomaly Transformer is a recent model that achieves substantial improvements by introducing a novel criterion, the association discrepancy. This criterion exploits the observation that normal time points attend mainly to their adjacent neighbors, whereas anomalous time points lose this property. However, Anomaly Transformer considers only the associations between time points and neglects the associations between sensors. Moreover, the association discrepancy, computed from the attention scores between time points and a Gaussian prior, can be insufficient. To achieve better performance, we propose the spatial information enhanced transformer (SiET), a novel paradigm that integrates graph neural networks into the Anomaly Transformer framework, yielding a high-dimensional association discrepancy. Theoretical proofs are provided to verify the reliability of our model. Furthermore, SiET achieves results comparable to the state of the art on five unsupervised time-series anomaly detection benchmarks.