Abstract
Reconstructing spectral functions from Euclidean Green's functions is an important inverse problem in many-body physics. However, the inversion is known to be ill-posed for realistic systems with noisy Green's functions. In this paper we propose an automatic differentiation (AD) framework as a generic tool for spectral reconstruction from propagator observables. Exploiting the regularization inherent in neural networks as a nonlocal smoothness regulator of the spectral function, we represent spectral functions by neural networks and use the propagator's reconstruction error to optimize the network parameters in an unsupervised manner. In the training process, apart from the positive-definite form of the spectral function, no other explicit physical priors are embedded into the neural networks. The reconstruction performance is assessed through relative entropy and mean square error for two different network representations. Compared to the maximum entropy method, the AD framework achieves better performance in the large-noise situation. We note that the freedom to introduce nonlocal regularization is an inherent advantage of the present framework and may lead to substantial improvements in solving inverse problems.
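To make the described workflow concrete, the following is a minimal sketch of the unsupervised AD reconstruction loop, not the paper's exact setup: the kernel form K(τ, ω) = exp(−ωτ), the network sizes, the grids, and the mock noisy "measured" propagator are all illustrative assumptions. A small MLP parameterizes a positive spectral function ρ(ω), the propagator is reconstructed through the spectral integral, and the reconstruction error is minimized by gradient descent obtained via automatic differentiation.

```python
# Sketch only: illustrative kernel, grids, network, and mock data.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
omegas = jnp.linspace(0.0, 10.0, 200)        # frequency grid (assumed)
taus = jnp.linspace(0.1, 5.0, 30)            # Euclidean-time grid (assumed)
kernel = jnp.exp(-jnp.outer(taus, omegas))   # assumed kernel K(tau, omega) = exp(-omega * tau)

def init_params(key, sizes=(1, 32, 32, 1)):
    """Initialize a small MLP; layer sizes are an illustrative choice."""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (n_in, n_out)) / jnp.sqrt(n_in)
        params.append((w, jnp.zeros(n_out)))
    return params

def rho(params, omega):
    """Neural-network representation of the spectral function; softplus enforces positivity."""
    x = omega[:, None]
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return jax.nn.softplus(x @ w + b)[:, 0]

def reconstruct_G(params):
    """Discretized spectral integral: G(tau) = sum_omega K(tau, omega) rho(omega) d_omega."""
    d_omega = omegas[1] - omegas[0]
    return kernel @ rho(params, omegas) * d_omega

def loss(params, G_data):
    """Propagator reconstruction error used as the unsupervised training objective."""
    return jnp.mean((reconstruct_G(params) - G_data) ** 2)

# Mock "measured" propagator from a hypothetical spectral function, plus noise.
rho_true = jnp.exp(-(omegas - 3.0) ** 2)
G_data = kernel @ rho_true * (omegas[1] - omegas[0])
G_data = G_data + 1e-4 * jax.random.normal(key, G_data.shape)

# Plain gradient descent on the network parameters via automatic differentiation.
params = init_params(key)
grad_fn = jax.jit(jax.grad(loss))
lr = 1e-2
for step in range(2000):
    grads = grad_fn(params, G_data)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

In this sketch the only explicit prior is the positivity of ρ(ω) from the softplus output; any additional smoothness comes implicitly from the network parameterization, which is the kind of nonlocal regularization the abstract refers to.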