Abstract

The principal idea of parallel image processing is to divide an image processing task into simple subtasks and process them concurrently. Because of the large size of high-resolution image datasets, most desktop workstations lack the computing capacity to perform image processing tasks in a timely manner. The processing power of a typical desktop workstation therefore becomes a severe bottleneck, causing long waits when reviewing and enhancing high-resolution image data. Many image processing tasks exhibit a high degree of data locality and parallelism and map readily onto a parallel computing system. This paper presents an alternative to sequential image processing by applying the MapReduce technique to segment multiple images on the Hadoop framework. The proposed scheduling algorithm is evaluated by implementing a parallel image segmentation algorithm that detects lung tumours in image datasets of up to 1 GB. The results show improved performance for parallel image segmentation over the sequential method, particularly once the data volume exceeds a certain threshold: the parallel approach drives CPU utilization on an octa-core machine up to 96%, reducing task execution time by a factor of approximately 1.6 compared with sequential segmentation. The proposed parallel image segmentation design is useful for researchers performing bulk image segmentation, where it can save substantial execution time.
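The paper's own implementation is not reproduced here. As a rough illustration of the approach the abstract describes, the following is a minimal map-only Hadoop sketch in Java: the job input is a text file listing HDFS image paths, each map task processes one image, and a simple global threshold stands in for the actual lung-tumour segmentation algorithm. All names (ParallelSegmentation, seg.output.dir, the threshold value 128) are illustrative assumptions, not the authors' code.

import java.awt.image.BufferedImage;
import java.io.IOException;
import javax.imageio.ImageIO;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class ParallelSegmentation {

    public static class SegmentMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        // Illustrative cutoff; the paper's segmentation method is not specified here.
        private static final int THRESHOLD = 128;

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            Configuration conf = context.getConfiguration();
            FileSystem fs = FileSystem.get(conf);

            // Each input record is one HDFS path to an image file.
            Path in = new Path(value.toString().trim());
            BufferedImage img;
            try (FSDataInputStream is = fs.open(in)) {
                img = ImageIO.read(is);
            }
            if (img == null) {
                return; // skip unreadable files
            }

            // Stand-in segmentation: global threshold on pixel brightness.
            BufferedImage mask = new BufferedImage(
                    img.getWidth(), img.getHeight(), BufferedImage.TYPE_BYTE_BINARY);
            for (int y = 0; y < img.getHeight(); y++) {
                for (int x = 0; x < img.getWidth(); x++) {
                    int rgb = img.getRGB(x, y);
                    int gray = ((rgb >> 16 & 0xFF) + (rgb >> 8 & 0xFF) + (rgb & 0xFF)) / 3;
                    mask.setRGB(x, y, gray > THRESHOLD ? 0xFFFFFF : 0x000000);
                }
            }

            // Write the segmented mask to a side directory (name is an assumption).
            Path dst = new Path(conf.get("seg.output.dir") + "/" + in.getName());
            try (FSDataOutputStream os = fs.create(dst, true)) {
                ImageIO.write(mask, "png", os);
            }
            context.write(new Text(dst.toString()), NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("seg.output.dir", args[1] + "_images");
        Job job = Job.getInstance(conf, "parallel image segmentation");
        job.setJarByClass(ParallelSegmentation.class);
        job.setMapperClass(SegmentMapper.class);
        job.setNumReduceTasks(0); // map-only job: segmentation is embarrassingly parallel
        job.setInputFormatClass(NLineInputFormat.class);
        NLineInputFormat.setNumLinesPerSplit(job, 1); // one image per map task
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Under this design each map task owns a whole image, so the cluster scheduler spreads images across cores and nodes without any inter-task communication, which is what allows CPU utilization to scale with the number of cores as the abstract reports.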
