Abstract

The cost of stochastically sampling quantum chromodynamics (QCD) vacuum configurations far outweighs the cost of the remaining computational tasks in lattice QCD, owing to the non-local forces that arise from the dynamics of fermion loops in the vacuum fluctuations. The quality, and hence the efficiency, of a sampling algorithm is largely determined by its decorrelation capacity along the Monte Carlo time series. To gain control over statistical errors, state-of-the-art research and development on QCD sampling algorithms requires a substantial number of teraflops-hours. Over the past years, two German–Italian collaborations, SESAM and TχL, carried out exploratory simulations, joining their resources in a meta-computing effort across various computer platforms in Italy and Germany. In this article, we discuss the practical aspects of this work, present highlights of autocorrelation measurements, illustrate the impact of unquenching on some fundamental parameters of QCD, and describe the lessons to be learned for future, more realistic computer experiments of this kind.
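A standard way to quantify the decorrelation capacity mentioned above is the integrated autocorrelation time τ_int of an observable measured along the Monte Carlo time series. The sketch below is illustrative only and not taken from the paper: it estimates τ_int for a generic scalar series with an automatic windowing rule in the style of Madras and Sokal (truncating the sum over the autocorrelation function once the window exceeds c·τ_int; the constant c = 4 and the AR(1) test chain are hypothetical choices made here for demonstration).

```python
import numpy as np

def integrated_autocorrelation_time(series, c=4.0):
    """Estimate tau_int of a Monte Carlo time series with an automatic
    windowing rule: stop summing rho(t) once t exceeds c * tau_int."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.dot(x, x) / n          # rho(0) normalization
    tau = 0.5                       # tau_int = 1/2 + sum_{t>=1} rho(t)
    for t in range(1, n // 2):
        rho = np.dot(x[:-t], x[t:]) / ((n - t) * var)
        tau += rho
        if t >= c * tau:            # window large enough; truncate the sum
            break
    return tau

# Illustrative check against an AR(1) chain, for which the exact value
# is tau_int = (1 + a) / (2 * (1 - a)).
rng = np.random.default_rng(0)
a = 0.9
chain = np.empty(100_000)
chain[0] = 0.0
for i in range(1, len(chain)):
    chain[i] = a * chain[i - 1] + rng.standard_normal()

print(f"estimated tau_int: {integrated_autocorrelation_time(chain):.2f}")
print(f"exact tau_int:     {(1 + a) / (2 * (1 - a)):.2f}")
```

For the AR(1) chain with coefficient a = 0.9 the exact value is τ_int = (1 + a)/(2(1 − a)) = 9.5, and the estimator should land close to it. Given N measurements, the effective number of independent samples is roughly N/(2 τ_int), which is why algorithms with shorter autocorrelation times translate directly into smaller statistical errors per unit of computer time.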
