Abstract

Live-cell microscopy is quickly becoming an indispensable technique for studying the dynamics of cellular processes. Maintaining the specimen in focus during image acquisition is crucial for high-throughput applications, especially for long experiments or when a large sample is continuously scanned. Automated focus-control methods are often expensive, imperfect, or ill-adapted to a specific application, and they remain a bottleneck for the widespread adoption of high-throughput, live-cell imaging. Here, we demonstrate a neural network approach for automatically maintaining focus during bright-field microscopy. Z-stacks of yeast cells growing in a microfluidic device were collected and used to train a convolutional neural network to classify images according to their z-position. We studied how prediction accuracy depends on the network's hyperparameters, including downsampling, batch size, and z-bin resolution. The network predicted the z-position of an image to within ±1 μm, outperforming human annotators. Finally, we used our neural network to control microscope focus in real time during a 24-hour growth experiment. The method robustly maintained the correct focal position, compensating for 40 μm of focal drift, and was insensitive to changes in the field of view. Only about 100 annotated z-stacks were required to train the network, making our method practical for custom autofocus applications.
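
To make the approach concrete, the sketch below shows the kind of z-bin classifier and closed-loop correction step described above. It is a minimal illustration, not the published implementation: the Keras-style architecture, the number of z-bins (N_Z_BINS), the downsampled image size (IMG_SIZE), and the helper functions build_zfocus_cnn and focus_correction_um are assumptions made for this example.

    # Minimal sketch of a CNN that classifies a bright-field image into a z-bin
    # and converts the prediction into a stage correction. All constants and
    # function names below are illustrative assumptions, not the paper's values.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    N_Z_BINS = 28      # hypothetical number of discrete z-positions (classes)
    IMG_SIZE = 128     # hypothetical downsampled image size (pixels)
    Z_STEP_UM = 1.0    # hypothetical spacing between adjacent z-bins (micrometers)

    def build_zfocus_cnn() -> tf.keras.Model:
        """Small CNN that maps a grayscale image to a probability over z-bins."""
        model = models.Sequential([
            layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
            layers.Conv2D(16, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.GlobalAveragePooling2D(),
            layers.Dense(64, activation="relu"),
            layers.Dense(N_Z_BINS, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def focus_correction_um(model: tf.keras.Model, image: np.ndarray) -> float:
        """Predict the z-bin of a live image and return a signed correction in um.

        Assumes the central bin corresponds to best focus; the returned value is
        the distance the stage/objective should move to restore that bin.
        """
        x = image.astype("float32")[np.newaxis, ..., np.newaxis] / 255.0
        probs = model.predict(x, verbose=0)[0]
        predicted_bin = int(np.argmax(probs))
        in_focus_bin = N_Z_BINS // 2
        return (in_focus_bin - predicted_bin) * Z_STEP_UM

In a closed-loop experiment, the correction returned by focus_correction_um would be applied to the stage or objective between acquisitions, which is how a classifier of this kind can compensate for gradual focal drift without acquiring extra z-stacks at run time.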

Highlights

  • Biologists routinely use live-cell imaging to monitor the dynamics of the cell’s state, to track real-time biochemical processes in vivo, and to read out cellular phenotypes from time-lapse microscopy images[1,2,3]

  • To infer the focal position of a yeast cell culture during live-cell microscopy, we built and trained a convolutional neural network (CNN) using z-stacks of yeast cells growing in a microfluidic device

  • We present a CNN approach for inferring the focal position of microbial cells under bright-field microscopy and for using the network's output to reliably and accurately control a microscope in real time to maintain focus

Introduction

Biologists routinely use live-cell imaging to monitor the dynamics of the cell’s state, to track real-time biochemical processes in vivo, and to read out cellular phenotypes from time-lapse microscopy images [1,2,3]. Keeping the specimen in focus throughout such acquisitions typically relies on either active or passive autofocus. Active autofocus uses knowledge of the physical characteristics of the system to obtain defocus information and correct the defocus [22]; electromagnetic waves, such as laser or infrared light, are used to maintain the distance between the object of interest and the lens [20]. In passive, image-based autofocus, a predefined focal reference is first determined, typically by taking a series of images at multiple z-positions on both sides of the sample's best focus. Passive autofocus is not ideal for tracking changing objects or for high-throughput image acquisition.
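
For contrast with the learning-based approach, the sketch below illustrates how a conventional passive, image-based autofocus search typically proceeds: acquire a small z-stack, score each slice with a sharpness metric, and move to the sharpest plane. This is an illustration under stated assumptions, not the method of the cited works: the Brenner gradient is one common metric chosen here for simplicity, and the z-stack is assumed to be a numpy array of shape (n_slices, height, width).

    # Illustrative passive-autofocus scoring: pick the sharpest slice of a z-stack.
    # The Brenner gradient metric and the function names are assumptions for this
    # example, not necessarily what the cited autofocus systems use.
    import numpy as np

    def brenner_gradient(image: np.ndarray) -> float:
        """Sharpness score: sum of squared intensity differences two pixels apart."""
        diff = image[:, 2:].astype(np.float64) - image[:, :-2].astype(np.float64)
        return float(np.sum(diff ** 2))

    def best_focus_index(z_stack: np.ndarray) -> int:
        """Return the index of the sharpest slice in a (n_slices, h, w) z-stack."""
        scores = [brenner_gradient(slice_2d) for slice_2d in z_stack]
        return int(np.argmax(scores))

Because a search of this kind needs a fresh series of images each time focus is re-evaluated, it adds acquisition time and light exposure, which is part of why passive autofocus scales poorly to high-throughput or rapidly changing samples.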
