Abstract
Tensor network states provide an efficient class of states that faithfully capture strongly correlated quantum models and systems in classical statistical mechanics. While tensor networks can now be seen as standard tools in the description of such complex many-body systems, close-to-optimal variational principles based on such states are less obvious to come by. In this work, we generalize a recently proposed variational uniform matrix product state algorithm for capturing one-dimensional quantum lattices in the thermodynamic limit to the study of regular two-dimensional tensor networks with a non-trivial unit cell. A key property of the algorithm is a computational effort that scales linearly rather than exponentially in the size of the unit cell. We demonstrate the performance of our approach on the computation of the classical partition functions of the antiferromagnetic Ising model and interacting dimers on the square lattice, as well as of a quantum doped resonating valence bond state.
Highlights
Tensor network methods are increasingly becoming a standard tool for studying the physics of strongly correlated systems, both from the perspective of a theoretical and mathematical understanding of many-body effects and as a versatile toolbox for numerical simulations [1, 2, 3, 4, 5, 6].
Two-dimensional tensor networks are most naturally obtained in the context of two-dimensional statistical mechanics, where they appear as a representation of the partition function of lattice spin models with local interactions.
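To make this representation concrete, here is a hedged sketch of the standard construction for the square-lattice classical Ising model: the bond Boltzmann weight is split into two half-factors, which are absorbed into a local four-leg tensor T, and contracting the resulting network reproduces the partition function. For simplicity the sketch uses the ferromagnetic model (the antiferromagnetic case studied in the paper requires a complex square root of the bond matrix), and the 2x2 torus check is purely illustrative.

```python
import numpy as np

# Illustrative sketch (not the paper's code): local tensor of the
# square-lattice ferromagnetic Ising partition function, checked
# exactly against brute-force summation on a tiny 2x2 torus.
beta = 0.4

# Bond Boltzmann matrix M[s, s'] = exp(beta * s * s'), s, s' in {+1, -1}.
spins = np.array([1.0, -1.0])
M = np.exp(beta * np.outer(spins, spins))

# Symmetric square root X of M, so that M = X @ X.T.
w, v = np.linalg.eigh(M)
X = v @ np.diag(np.sqrt(w)) @ v.T

# Local tensor T[u, r, d, l]: one half-factor X per leg, summed over
# the physical spin sitting on the site.
T = np.einsum('su,sr,sd,sl->urdl', X, X, X, X)

# Exact contraction on a 2x2 torus; the 8 labels a..h are the 8 bonds
# (each nearest-neighbour pair is doubly connected by the wrapping).
Z_tn = np.einsum('faeb,hbga,ecfd,gdhc->', T, T, T, T)

# Brute-force partition function on the same 2x2 torus.
Z_bf = 0.0
for s00 in spins:
    for s01 in spins:
        for s10 in spins:
            for s11 in spins:
                E = 2 * (s00 * s01 + s10 * s11 + s00 * s10 + s01 * s11)
                Z_bf += np.exp(beta * E)
```

The same splitting works for any model with nearest-neighbour Boltzmann weights, as long as the bond matrix admits a (possibly complex) square root.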
For our third benchmark we start from the resonating valence bond (RVB) state on the square lattice, which can be represented as a translation-invariant projected entangled-pair state (PEPS) built from a local tensor A^s_{u,r,d,l} with explicit SU(2) invariance [61].
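As a rough illustration of what such a local tensor looks like, the following sketch builds the standard nearest-neighbour RVB PEPS tensor with physical dimension 2 and virtual dimension 3 (spin-1/2 plus a vacuum level), as used e.g. by Poilblanc and Schuch. The doped RVB state benchmarked in the paper enlarges this construction, so treat this as a hypothetical, structural example only.

```python
import numpy as np

# Hedged sketch: nearest-neighbour RVB PEPS tensor A[s, u, r, d, l].
# Virtual basis: {0, 1} carry a spin-1/2 component, 2 is the vacuum.
# The entry is nonzero iff exactly one virtual leg carries the
# physical spin s and the other three legs are in the vacuum state.
d, D = 2, 3          # physical dimension, virtual dimension
VAC = 2              # index of the virtual vacuum level

A = np.zeros((d, D, D, D, D))
for s in range(d):                   # physical spin component
    for leg in range(4):             # which virtual leg carries the spin
        idx = [VAC, VAC, VAC, VAC]
        idx[leg] = s                 # one leg in state |s>, rest vacuum
        A[(s, *idx)] = 1.0
```

In the full construction a singlet-plus-vacuum bond matrix is applied on every link before contraction; that step is omitted here.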
Summary
Variational MPS tangent-space methods such as VUMPS can exploit advanced solvers for the leading eigenvector of the transfer matrix, a property that leads to a significant speed-up of the algorithm. The relevant two-dimensional tensor network cannot always be chosen to be translation invariant: either it consists of a larger unit cell of different tensors that is repeated over the infinite lattice, or the tensor network itself is translation invariant but the lattice symmetry is spontaneously broken. In both cases, an algorithm with uniform tensors cannot be used for the contraction. We show that this generalization of the VUMPS algorithm is a very natural one and leads to an algorithm with a complexity that scales linearly with the size of the non-trivial unit cell.

Accepted in Quantum 2020-08-03.
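The point about advanced eigensolvers can be illustrated on the simplest object involved, the MPS transfer operator. The sketch below (an assumption-laden illustration, not the paper's implementation; the tensor A and dimensions D, d are made up) finds its leading eigenvalue with a matrix-free Arnoldi solver, which only needs the action of the operator, and cross-checks against dense diagonalization of the explicit D^2 x D^2 matrix.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigs

# Illustrative sketch: leading eigenpair of an MPS transfer operator
# E(rho) = sum_s A^s rho (A^s)^T for a real random MPS tensor A,
# computed matrix-free with ARPACK (scipy's Arnoldi solver).
rng = np.random.default_rng(0)
D, d = 6, 2                        # bond and physical dimensions (made up)
A = rng.standard_normal((D, d, D))

def apply_E(vec):
    """Apply the transfer operator to a vectorized D x D matrix."""
    rho = vec.reshape(D, D)
    out = np.zeros((D, D))
    for s in range(d):
        out += A[:, s, :] @ rho @ A[:, s, :].T
    return out.reshape(-1)

E_op = LinearOperator((D * D, D * D), matvec=apply_E, dtype=np.float64)
lam_arnoldi = eigs(E_op, k=1, which='LM', return_eigenvectors=False)[0]

# Dense cross-check: the same operator as an explicit matrix.
E_dense = sum(np.kron(A[:, s, :], A[:, s, :]) for s in range(d))
lam_dense = max(np.linalg.eigvals(E_dense), key=abs)
```

The matrix-free route costs O(D^3) per application instead of the O(D^4) needed to even store the dense operator, which is where the speed-up for larger bond dimensions comes from.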