Abstract

The remarkable growth of digital imaging in the past several years has blurred the line between "real" images and those that are digitally enhanced. Movies and print advertisements routinely feature effects that create images that look real but are not. For the last several decades, film has been the predominant medium scientists have used to record the images viewed through their microscopes. Film was readily accepted as a valid, archival recording medium because it was difficult to alter once the exposure was made and the film or print developed. Improvements in computer power and image resolution, coupled with environmental considerations, have spurred the scientific community to replace photographic processes with digital images. Processes that once took hours or days are now performed in minutes. Imaging programs such as Adobe Photoshop can perform all of the photographic steps that once required a darkroom, and much more. The image can be "burned" onto a CD or DVD, which is difficult to alter. Printer technology has advanced so rapidly that inkjet prints rival photographic prints, and images are increasingly distributed digitally and viewed on displays that continue to improve. As the hardware for digital imaging improves, the quality of digital images is approaching photographic quality at a fraction of the cost, and publication-quality images are produced in a fraction of the time required for film-based photography.
