Abstract

As cosmological simulations have grown in size, the permanent storage requirements of their particle data have also grown. Even modest simulations present a major logistical challenge for the groups which run these boxes, and researchers without access to high-performance computing facilities often need to restrict their analysis to lower-quality data. In this paper, we present guppy, a compression algorithm and code base tailored to reduce the sizes of dark matter-only cosmological simulations by approximately an order of magnitude. guppy is a 'lossy' algorithm, meaning that it injects a small amount of controlled and uncorrelated noise into particle properties. We perform extensive tests on the impact that this noise has on the internal structure of dark matter haloes, and identify conservative accuracy limits that ensure compression has no practical impact on single-snapshot halo properties, profiles, and abundances. We also release functional prototype libraries in C, Python, and Go for reading and creating guppy data.
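To make the idea of "controlled and uncorrelated noise" concrete, the sketch below shows one generic way a lossy scheme of this kind can quantize a particle property (e.g. a position coordinate) to a fixed accuracy dx while keeping the reconstruction error independent of the underlying value: round on a dithered grid whose random offsets are regenerated from a stored seed. This is a minimal illustration of dithered quantization, not guppy's actual algorithm or API; the function names, the NumPy dependency, and the choice of seed handling are all assumptions made for the example.

```python
import numpy as np

def quantize_with_dither(x, dx, seed):
    """Quantize values x onto a grid of spacing dx, using random dither so the
    reconstruction error is uniform in (-dx/2, dx/2] and uncorrelated with x.
    (Hypothetical helper for illustration; not part of the guppy libraries.)"""
    rng = np.random.default_rng(seed)      # in practice the seed would be stored with the file
    offsets = rng.random(x.shape)          # one uniform offset in [0, 1) per value
    return np.floor(x / dx + offsets).astype(np.int64)   # small integers to store

def dequantize(codes, dx, seed):
    """Reconstruct approximate values from the stored integer codes by
    regenerating the same dither offsets from the shared seed."""
    rng = np.random.default_rng(seed)
    offsets = rng.random(codes.shape)
    return (codes - offsets + 0.5) * dx

# Demo with mock particle positions in a 100 Mpc/h box.
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 100.0, size=(1000, 3))
codes = quantize_with_dither(pos, dx=1e-3, seed=42)
approx = dequantize(codes, dx=1e-3, seed=42)
print("max abs error:", np.max(np.abs(approx - pos)))   # bounded by dx/2 = 5e-4
```

The storage saving in such schemes comes from the quantized values being small integers that compress well with standard entropy coders, while the dither keeps the injected noise statistically simple; the accuracy limits tested in the paper correspond to choosing dx (and its analogues for velocities and other properties) small enough that halo properties are unaffected.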

