Abstract

Deep learning methods have been shown to be effective in representing ground-state wavefunctions of quantum many-body systems; however, existing approaches are difficult to apply to non-square lattices or to large systems. Here, we propose a variational ansatz based on the graph attention network (GAT), which learns distributed latent representations and can be used on non-square lattices. The GAT-based ansatz has a computational complexity that grows linearly with the system size and extends naturally to large systems. Numerical results show that our method achieves state-of-the-art results on spin-1/2 J1–J2 Heisenberg models over the square, honeycomb, triangular, and kagome lattices with different interaction strengths and lattice sizes (up to 24 × 24 for the square lattice). The method also provides excellent results for the ground states of transverse-field Ising models on square lattices. GAT-based techniques are efficient and versatile and hold promise for studying large quantum many-body systems, whose Hilbert spaces grow exponentially with system size.
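As a rough illustration of the kind of architecture the abstract describes (not the authors' published implementation), the sketch below shows a GAT-based variational ansatz in plain PyTorch: spins are node features on the lattice graph, attention is restricted to lattice edges, and a pooled readout yields the complex log-amplitude of a configuration. All names here (GATLayer, GATAnsatz, hidden, etc.) are hypothetical; restricting attention to a fixed-degree neighborhood is what gives the linear scaling with system size mentioned above.

```python
# Minimal sketch of a GAT-based wavefunction ansatz, assuming plain PyTorch.
# Not the paper's code; an illustration of the general technique only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer over a fixed lattice edge list."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # feature transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scorer

    def forward(self, h, edge_index):
        src, dst = edge_index                  # (2, E) directed neighbor pairs
        z = self.W(h)                          # (N, out_dim)
        # Attention logit for each edge from the concatenated endpoint features.
        e = F.leaky_relu(self.a(torch.cat([z[src], z[dst]], dim=-1)).squeeze(-1))
        e = e - e.max()                        # stabilize exp; cancels in the ratio
        alpha = torch.exp(e)
        # Normalize attention over each destination node's neighborhood.
        norm = torch.zeros(h.size(0), device=h.device).index_add_(0, dst, alpha)
        alpha = alpha / norm[dst].clamp_min(1e-12)
        # Weighted aggregation of neighbor features.
        out = torch.zeros_like(z).index_add_(0, dst, alpha.unsqueeze(-1) * z[src])
        return F.elu(out)

class GATAnsatz(nn.Module):
    """Maps a spin configuration to log psi = log-amplitude + i * phase."""
    def __init__(self, hidden=32, layers=2):
        super().__init__()
        dims = [1] + [hidden] * layers
        self.gats = nn.ModuleList(GATLayer(d_in, d_out)
                                  for d_in, d_out in zip(dims, dims[1:]))
        self.readout = nn.Linear(hidden, 2)    # per-site (log|psi|, phase)

    def forward(self, spins, edge_index):
        h = spins.unsqueeze(-1).float()        # (N, 1), spins in {-1, +1}
        for gat in self.gats:
            h = gat(h, edge_index)
        logamp, phase = self.readout(h).sum(0)  # sum-pool over lattice sites
        return torch.complex(logamp, phase)     # log psi(s), a complex scalar
```

Because each attention step touches only the edges of the lattice graph, evaluating log psi costs O(N) for a lattice of fixed coordination number, and the same module applies unchanged to square, honeycomb, triangular, or kagome edge lists.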
