Abstract

Electron tomography (ET) has emerged as a powerful technique to address fundamental questions in molecular and cellular biology. It enables visualization of the molecular architecture of complex viruses, organelles and cells at a resolution of a few nanometres. In the last decade ET has enabled major breakthroughs that have provided exciting insights into a wide range of biological processes. In ET the biological sample is imaged with an electron microscope, and a series of images is taken of the sample at different tilt angles. Prior to imaging, the sample has to be specially prepared to withstand the conditions within the microscope. Subsequently, those images are processed and combined to yield the three-dimensional reconstruction, or tomogram. Afterwards, a number of computational steps are necessary to facilitate the interpretation of the tomogram, such as noise reduction, segmentation and analysis of subvolumes. As the computational demands are huge in some of these stages, high performance computing (HPC) techniques are used to make the problem tractable in reasonable time. This article comprehensively reviews the methods, technologies and tools involved in the different computational stages behind structural studies by ET, from image acquisition to interpretation of tomograms. The HPC techniques usually employed to cope with the computational demands are also briefly described.
