Abstract

Identifying Unstripped Bunches (USB) is a pivotal challenge in palm oil production, as they reduce mill efficiency. Existing manual detection methods are time-consuming and prone to inaccuracies. We therefore propose a computer vision approach based on Faster R-CNN (Region-based Convolutional Neural Network), a robust object detection algorithm, complemented by a Progressive Growing Generative Adversarial Network (PGGAN) for synthetic image generation. Because authentic USB images are scarce, which can hinder Faster R-CNN training, PGGAN is used to generate synthetic images of Empty Fruit Bunches (EFB) and USB. Our approach pairs these synthetic images with authentic ones to train the Faster R-CNN, with VGG16 serving as the feature-extraction backbone. In our experiments, a USB detector trained solely on authentic images achieved an accuracy of 77.1%, demonstrating the potential of the methodology, while training solely on synthetic images yielded a slightly lower accuracy of 75.3%. Combining authentic and synthetic images in a balanced 1:1 ratio raised accuracy to 87.9%, a 10.1% improvement. These results underscore the value of synthetic data augmentation in refining detection systems: combining authentic and synthetic data achieves a level of USB detection accuracy that neither source attains alone. This contribution has significant implications for the industry and motivates further exploration of advanced data synthesis techniques and detection model refinement.
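
The sketch below illustrates the general setup the abstract describes: a Faster R-CNN detector with a VGG16 backbone, trained on a 1:1 mix of authentic and PGGAN-generated images. It is a minimal, hedged example rather than the authors' implementation; the dataset class, directory paths, and training hyperparameters are assumptions introduced for illustration.

```python
# Minimal sketch (not the authors' code): Faster R-CNN with a VGG16 backbone in
# torchvision, trained on a 1:1 mix of authentic and PGGAN-synthesized images.
# Dataset paths and the PalmBunchDataset class are hypothetical placeholders.
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision.models import vgg16
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

# Use VGG16's convolutional layers as the feature extractor.
backbone = vgg16(weights="IMAGENET1K_V1").features
backbone.out_channels = 512  # final VGG16 conv block outputs 512 feature channels

# Single feature map, so one tuple of anchor sizes / aspect ratios.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)

# Two object classes (EFB, USB) plus background.
model = FasterRCNN(backbone, num_classes=3, rpn_anchor_generator=anchor_generator)

# Hypothetical datasets returning (image_tensor, target_dict) pairs;
# concatenating equally sized authentic and synthetic sets gives the 1:1 ratio.
# authentic_ds = PalmBunchDataset("data/authentic")        # real mill images
# synthetic_ds = PalmBunchDataset("data/pggan_synthetic")  # PGGAN-generated images
# train_ds = ConcatDataset([authentic_ds, synthetic_ds])
# loader = DataLoader(train_ds, batch_size=2, shuffle=True,
#                     collate_fn=lambda batch: tuple(zip(*batch)))

# optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# model.train()
# for images, targets in loader:
#     loss_dict = model(list(images), list(targets))  # returns per-component losses
#     loss = sum(loss_dict.values())
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
```

In this assumed setup, the synthetic images act purely as additional training data: they share the same annotation format as the authentic images, so the detector and training loop are unchanged and only the dataset composition varies between the 77.1%, 75.3%, and 87.9% configurations reported in the abstract.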
