Abstract

Biomedical image registration, the geometric alignment of two-dimensional and/or three-dimensional (3-D) image data, is becoming increasingly important in diagnosis, treatment planning, functional studies, computer-guided therapies, and biomedical research [1]. Registration is a fundamental task in image processing. In medical image processing, many techniques have been proposed to find a geometric transformation that relates the points of one image to their corresponding points in another image. In recent years, multimodality registration techniques have been introduced in medical imaging; in particular, CT and MR imaging of the head for diagnosis and surgical planning show that physicians and surgeons gain important complementary information from these modalities. In radiotherapy planning, registration of MR and CT images of the brain has traditionally been performed manually. At present, physicians segment the volumes of interest (VOIs) from each set of slices by hand, but manual segmentation of the object area may require several hours of analysis, and MDCT and MR studies often contain more than 100 slices. Manual segmentation and registration are therefore impractical for clinical application to head CT and MR images. Many automatic and semi-automatic image registration methods have been proposed [2]; the main approaches rely on manual operation, landmark matching, or voxel intensity information. In this paper, an automatic intensity-based registration of head images is carried out by maximization of mutual information. The primary objectives of this paper are to increase the accuracy of the registration and to reduce the processing time. Experiments show that our algorithm is a robust and efficient method that yields accurate registration results.
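The abstract does not give implementation details, but the core idea of intensity-based registration by maximization of mutual information can be sketched as follows: mutual information (MI) between the intensity distributions of two images is estimated from their joint histogram, and the transformation that maximizes MI is taken as the registration. The sketch below is a minimal illustration, not the paper's method; the histogram-based MI estimator, the 32-bin choice, and the exhaustive integer-translation search are all simplifying assumptions (a real system would use 3-D rigid or affine transforms, interpolation, and a gradient-free optimizer such as Powell's method).

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate MI between two equally shaped images from their
    joint intensity histogram: MI = sum p(x,y) log[p(x,y)/(p(x)p(y))]."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                  # joint probability
    px = pxy.sum(axis=1, keepdims=True)        # marginal of a
    py = pxy.sum(axis=0, keepdims=True)        # marginal of b
    nz = pxy > 0                               # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def register_translation(fixed, moving, max_shift=5):
    """Toy registration: exhaustive search for the integer (dy, dx)
    shift of `moving` that maximizes MI with `fixed`."""
    best, best_mi = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            mi = mutual_information(fixed, shifted)
            if mi > best_mi:
                best_mi, best = mi, (dy, dx)
    return best, best_mi
```

Because MI depends only on the statistical relationship between intensity distributions, not on their absolute values, the same criterion applies when the two images come from different modalities (e.g. CT and MR), which is why MI is the standard similarity measure for multimodality registration.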
