Abstract

From the combined perspective of biologists, microscope instrumentation developers, imaging core facility scientists, and high-performance computing experts, we discuss the challenges faced when selecting imaging and analysis tools in the field of light-sheet microscopy. Our goal is to provide a contextual framework of basic computing concepts that cell and developmental biologists can refer to when mapping the peculiarities of different light-sheet data to specific existing computing environments and image analysis pipelines. We provide our perspective on efficient processes for tool selection and review current hardware and software commonly used in light-sheet image analysis, as well as discuss what ideal tools for the future may look like.

Highlights

  • Since light-sheet microscopy was introduced to the life and biomedical science communities in 1993 (Voie et al., 1993) and more broadly in 2004 (Huisken et al., 2004), there has been a virtual Cambrian explosion of light-sheet instrumentation and image analysis tools [see here for recent reviews (Reynaud et al., 2015; Albert-Smet et al., 2019; Wan et al., 2019)]

  • We have presented a high-level overview of computing concepts we find relevant to navigating the existing software available for analysis of light-sheet microscopy data

  • We have discussed the progression of light-sheet microscopy development from optical hardware to computing hardware and analysis software



INTRODUCTION

Since light-sheet microscopy was introduced to the life and biomedical science communities in 1993 (Voie et al., 1993) and more broadly in 2004 (Huisken et al., 2004), there has been a virtual Cambrian explosion of light-sheet instrumentation and image analysis tools [see here for recent reviews (Reynaud et al., 2015; Albert-Smet et al., 2019; Wan et al., 2019)]. Acquisition software for light-sheet microscopes can incorporate on-the-fly image pre-processing and processing steps that use GPU or CPU computational resources before data storage. This approach can reduce overall analysis time significantly, but it risks losing information from the raw data. Given the size of most light-sheet datasets, the type and structure of the file in which the voxels will be stored is important to consider in advance, both to improve data access speeds and to reduce the need for "data wrangling," that is, re-saving or restructuring the data so it can be computed on by a given analysis software.
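As a minimal sketch of such an on-the-fly pre-processing step, the following Python/NumPy snippet applies a constant background subtraction and 2×2 binning to each camera plane before it is stacked for storage, shrinking the stored volume roughly four-fold. The function name, the constant offset, and the simulated acquisition loop are illustrative assumptions, not any specific microscope's acquisition API.

```python
import numpy as np

def preprocess_plane(plane: np.ndarray, background: float = 100.0) -> np.ndarray:
    """Subtract a constant camera offset, clip negatives, then 2x2 bin.

    This is a hypothetical pre-processing step; real pipelines may use
    measured dark frames and flat-field maps instead of a constant offset.
    """
    corrected = np.clip(plane.astype(np.float32) - background, 0, None)
    h, w = corrected.shape
    # 2x2 binning: average non-overlapping 2x2 pixel blocks
    cropped = corrected[: h // 2 * 2, : w // 2 * 2]
    binned = cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return binned.astype(np.uint16)

# Simulated acquisition of a small z-stack: 8 planes of 512 x 512 pixels
rng = np.random.default_rng(0)
raw_planes = [rng.integers(90, 4000, size=(512, 512), dtype=np.uint16)
              for _ in range(8)]
volume = np.stack([preprocess_plane(p) for p in raw_planes])
print(volume.shape)  # (8, 256, 256)
```

Because binning and background subtraction are irreversible, pipelines like this trade raw-data fidelity for reduced storage and analysis time, which is exactly the risk noted above.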

[Table residue: workstation configurations listing memory options (64 GB, 96 GB, 128 GB) and storage options (SSD, 16 TB SSD RAID 0)]
DISCUSSION
