Abstract
Image segmentation, an indispensable stage of digital image processing for computer vision, plays an important role in linking image processing, image recognition and image analysis. The blanket fractal dimension method can help segment images containing irregular objects and has been applied in many fields. Morphology is an important tool for processing images through erosion, dilation, opening and closing operations, which can, to a certain extent, repair incomplete edges or contours of objects. The traditional fractal dimension method performs poorly when segmenting irregular objects from a complex background: it cannot distinguish objects whose gray levels are close to those of the background, nor cope with objects of different sizes, which easily leads to under-segmentation. To improve the segmentation effect, an improved image segmentation method based on bit planes and morphological reconstruction is proposed in the present work. The traditional blanket fractal dimension method is first used to produce a coarse segmentation of the input images, and the coarse results are then refined using bit planes and morphological reconstruction. The role of the reconstruction is to compensate for missing features and to eliminate invalid ones. The fine segmentation results obtained in the experiments show that the proposed method achieves more accurate segmentation than traditional image-processing methods.
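The sketch below illustrates the two-stage pipeline outlined above: a coarse map from the blanket fractal dimension, refined by bit-plane slicing and morphological reconstruction. It is only a minimal illustration under assumed details; the function names, the 3x3 blanket neighbourhood, the window size, the chosen bit plane, the threshold value, and the use of SciPy/scikit-image are all assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, uniform_filter
from skimage.morphology import reconstruction, binary_opening, binary_closing, disk


def blanket_fractal_dimension(image, max_scale=5, window=9):
    """Per-pixel fractal dimension via the blanket (covering) method.

    The upper blanket u and lower blanket b are grown iteratively; the local
    surface area A(eps) is the windowed sum of (u - b) / (2*eps), and the
    dimension is 2 minus the slope of log A(eps) versus log eps.
    """
    img = image.astype(np.float64)
    u = img.copy()
    b = img.copy()
    log_eps, log_area = [], []
    for eps in range(1, max_scale + 1):
        # Upper blanket: at least one unit above itself and its 3x3 neighbourhood.
        u = np.maximum(u + 1, grey_dilation(u, size=(3, 3)))
        # Lower blanket: at least one unit below itself and its 3x3 neighbourhood.
        b = np.minimum(b - 1, grey_erosion(b, size=(3, 3)))
        # Local surface area estimated over a sliding window.
        area = uniform_filter(u - b, size=window) / (2.0 * eps)
        log_eps.append(np.log(eps))
        log_area.append(np.log(area + 1e-12))
    # Least-squares slope of log A(eps) vs log eps at every pixel.
    x = np.array(log_eps)
    y = np.stack(log_area)                               # shape (scales, H, W)
    x_centered = x - x.mean()
    slope = (x_centered[:, None, None] * (y - y.mean(axis=0))).sum(axis=0) \
        / (x_centered ** 2).sum()
    return 2.0 - slope


def refine_with_bitplane_reconstruction(gray, coarse_mask, plane=7):
    """Refine a coarse mask with one high-order bit plane as the marker,
    morphological reconstruction, and opening/closing to clean the contours."""
    bit = ((gray >> plane) & 1).astype(bool)
    marker = np.logical_and(coarse_mask, bit).astype(np.uint8)
    mask = coarse_mask.astype(np.uint8)
    # Reconstruction by dilation recovers object parts missed by the marker
    # while never growing beyond the coarse mask.
    refined = reconstruction(marker, mask, method='dilation') > 0
    refined = binary_opening(refined, disk(2))   # remove small spurious blobs
    refined = binary_closing(refined, disk(2))   # close small gaps in contours
    return refined


# Example usage on an 8-bit grayscale image `gray` (NumPy array):
#   fd = blanket_fractal_dimension(gray)
#   coarse = fd > 2.3           # assumed threshold on the fractal dimension
#   fine = refine_with_bitplane_reconstruction(gray, coarse)
```

Keeping the reconstruction marker inside the coarse mask means the refinement can only recover or discard features within the coarse result, which matches the stated role of compensating missing features and eliminating invalid ones.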