Abstract

An increasing number of area detectors in use at Diamond Light Source produce high rates of data. In order to capture, store and process these data, High Performance Computing (HPC) systems have been implemented. This paper presents the architecture and usage for handling high-rate data: detector data capture, large-volume storage and parallel processing. The EPICS areaDetector framework has been adopted to abstract the detectors for common tasks including live processing, file format and storage. The chosen data format is HDF5, which provides multidimensional data storage and NeXus compatibility. The storage system and related computing infrastructure include a centralised Lustre-based parallel file system, a dedicated network and an HPC cluster. A well-defined roadmap is in place for evolving this infrastructure to meet demand as requirements and technology advance. For processing the science data, the HPC cluster allows efficient parallel computing on a mixture of x86 and GPU processing units. The nature of the Lustre storage system, in combination with the parallel HDF5 library, allows efficient disk I/O during computation jobs. Software that utilises optimised parallel file reading for a variety of post-processing techniques is being developed in collaboration as part of the Pan-Data EU Project (www.pan-data.eu). These developments are particularly applicable to tomographic reconstruction and to the processing of non-crystalline diffraction data.
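As a minimal sketch of the storage layout the abstract describes, the example below writes a stack of detector frames to an HDF5 file as a chunked dataset (one chunk per frame), so that a parallel post-processing worker can read any single frame without touching the rest of the file. The file name, dataset path and frame size are invented for illustration; areaDetector's actual file layout is not reproduced here.

```python
import h5py
import numpy as np

# Hypothetical frame stack: 8 frames of 512x512 16-bit detector data.
frames, height, width = 8, 512, 512

with h5py.File("scan_0001.h5", "w") as f:
    dset = f.create_dataset(
        "entry/data",                  # NeXus-style path (assumed for this sketch)
        shape=(frames, height, width),
        dtype="uint16",
        chunks=(1, height, width),     # one chunk per frame: frame-granular I/O
    )
    for i in range(frames):
        # Fill each frame with its index so reads are easy to check.
        dset[i] = np.full((height, width), i, dtype="uint16")

# A reader (e.g. one of many parallel workers) fetches a single frame;
# chunked storage means only that frame's bytes are read from disk.
with h5py.File("scan_0001.h5", "r") as f:
    frame = f["entry/data"][3]
```

With parallel HDF5 on a Lustre file system, many such workers can read disjoint frames of the same file concurrently, which is the access pattern tomographic reconstruction relies on.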
