With the global population surpassing 8 billion, waste production has skyrocketed, driving pollution that harms both terrestrial and marine ecosystems. Public littering, a significant contributor to this pollution, poses severe threats to marine life through plastic debris and endangers human health through contaminated food and water sources. Given annual global plastic consumption of approximately 475 million tons, addressing public littering has become critically urgent. The Surveillance and Waste Notification (SAWN) system presents an innovative solution to combat public littering: leveraging surveillance cameras and computer vision, SAWN aims to detect and reduce instances of littering. Our study explores the MoViNet video classification model to detect littering by vehicles and pedestrians, alongside the YOLOv8 object detection model to identify the individuals responsible via license plate detection and face detection. Because no suitable dataset for littering detection was publicly available, project members simulated real-life littering scenarios to collect the required data. This dataset was then used to train several models, including LRCN, CNN-RNN, and MoViNet; after extensive testing, MoViNet demonstrated the most promising results. Through a series of experiments, we progressively improved the model's performance, achieving accuracy of 93.42% in the first experiment, 95.53% in the second, and 99.5% in the third. To identify violators, we employed YOLOv8 trained on a KSA vehicle-plate dataset, achieving 99.5% accuracy. For face detection, we used the Haar Cascade classifier from the OpenCV library, chosen for its real-time performance. These findings will guide further improvements to littering-behavior detection in future work.
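The study's fine-tuned models are not described in code here; as a minimal sketch of how a MoViNet clip classifier of the kind used above can be loaded for inference, the snippet below pulls the publicly available pre-trained MoViNet-A0 model from TensorFlow Hub. The A0 variant, the Kinetics-600 classification head, and the 8-frame clip shape are illustrative assumptions, not the littering model trained in this work.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Pre-trained MoViNet-A0 with a Kinetics-600 classification head from TF Hub.
# The variant and clip dimensions below are assumptions for illustration.
hub_url = "https://tfhub.dev/tensorflow/movinet/a0/base/kinetics-600/classification/3"
encoder = hub.KerasLayer(hub_url)

# MoViNet expects a dict input {"image": [batch, frames, height, width, 3]}
# with float32 pixel values in [0, 1].
inputs = tf.keras.layers.Input(
    shape=[None, None, None, 3], dtype=tf.float32, name="image"
)
outputs = encoder(dict(image=inputs))
model = tf.keras.Model(inputs, outputs, name="movinet")

clip = tf.random.uniform((1, 8, 172, 172, 3))  # dummy 8-frame clip
logits = model(clip)                           # [1, 600] class logits
probs = tf.nn.softmax(logits, axis=-1)
```

Fine-tuning such a backbone on a custom littering dataset would replace the classification head with one sized to the littering/non-littering classes before training.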
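The violator-identification stage can be sketched in a similar hedged way by chaining an Ultralytics YOLOv8 detector with OpenCV's stock Haar Cascade face detector on a flagged frame. The weights file ksa_plate_yolov8.pt and the input image names are hypothetical placeholders, not artifacts released by the authors.

```python
import cv2
from ultralytics import YOLO

# Hypothetical YOLOv8 weights fine-tuned on a KSA license-plate dataset;
# the filename is a placeholder for illustration only.
plate_model = YOLO("ksa_plate_yolov8.pt")

# OpenCV ships this pre-trained frontal-face Haar Cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("littering_frame.jpg")  # e.g. a frame flagged by the classifier

# 1) Detect license plates with YOLOv8 and crop each detection.
for result in plate_model(frame):
    for x1, y1, x2, y2 in result.boxes.xyxy.int().tolist():
        plate_crop = frame[y1:y2, x1:x2]  # hand off to an OCR step downstream
        cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)

# 2) Detect faces with the Haar Cascade (runs in real time on CPU).
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("annotated_frame.jpg", frame)
```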