Abstract

Tensor networks (TNs) are approximations of high-dimensional tensors designed to represent locally entangled quantum many-body systems efficiently. This paper provides a comprehensive comparison between classical TNs and TN-inspired quantum circuits in the context of machine learning on highly complex, simulated Large Hadron Collider data. We show that classical TNs require exponentially large bond dimensions and mappings to higher-dimensional Hilbert spaces to perform comparably to their quantum counterparts. While such an expansion in dimensionality allows better performance, we observe that it also leads to a highly flat loss landscape, rendering gradient-based optimization methods highly challenging. Furthermore, by employing quantitative metrics, such as the Fisher information and the effective dimension, we show that classical TNs require a more extensive training sample to represent the data as efficiently as TN-inspired quantum circuits. We also explore hybrid classical-quantum TNs and show possible architectures that exploit a larger phase space of the data. We present our results using three main TN Ansätze: tree tensor networks, matrix product states, and the multiscale entanglement renormalization Ansatz.
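To illustrate the bond-dimension trade-off at the heart of the abstract, the following NumPy sketch (an illustrative example, not code from the paper) builds a random matrix product state and contracts it back into the full tensor it represents. The helper names `random_mps` and `contract_mps` are hypothetical; the point is that an MPS with `n` sites, physical dimension `d`, and bond dimension `chi` stores roughly `n * d * chi**2` parameters, whereas the full tensor has `d**n` entries, so the bond dimension controls how much entanglement the classical Ansatz can capture.

```python
import numpy as np

def random_mps(n_sites, phys_dim, bond_dim, seed=0):
    """Build a random MPS as a list of rank-3 cores (left bond, physical, right bond).

    Boundary bonds have dimension 1 so the contracted result is a plain tensor.
    """
    rng = np.random.default_rng(seed)
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.normal(size=(dims[i], phys_dim, dims[i + 1]))
            for i in range(n_sites)]

def contract_mps(mps):
    """Contract all cores along their shared bond indices into the full tensor."""
    result = mps[0]                        # shape (1, d, chi)
    for core in mps[1:]:
        # Merge the right bond of `result` with the left bond of `core`.
        result = np.tensordot(result, core, axes=([-1], [0]))
    # Drop the trivial boundary bonds of size 1: shape becomes (d, d, ..., d).
    return result.reshape(result.shape[1:-1])

# An MPS over 10 binary sites with bond dimension 4 is far smaller than
# the 2**10 = 1024-entry tensor it parameterizes.
mps = random_mps(n_sites=10, phys_dim=2, bond_dim=4)
full = contract_mps(mps)
n_params = sum(core.size for core in mps)
print(full.shape, n_params, full.size)
```

Exact representation of an arbitrary (highly entangled) state would require the bond dimension to grow exponentially with system size, which is the scaling limitation the paper contrasts with TN-inspired quantum circuits.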
