Abstract
Truck processing time at marine container terminal gates, including the processing time per lane and per truck at a specific timestamp, is a key measure of terminal performance. It has traditionally been collected through field observation over short periods (e.g., a few hours). Using the surveillance cameras widely available at terminal gates, researchers have attempted to collect truck processing time data by manually reviewing camera images frame by frame, but this manual review is labor-intensive and time-consuming. This study is motivated by the need to collect truck processing times effectively over long periods of time. An image-processing algorithm is proposed to automatically extract truck processing time data from low-frame-rate images (less than 1 frame per second, fps). The proposed algorithm includes three steps: (1) design of two regions of interest (ROIs) per lane to capture truck trajectories, (2) a frame-differencing change-detection algorithm that addresses the low frame rate and cast-shadow issues, and (3) a state transition model with a set of decision rules, accounting for perspective occlusion and other potential sources of false positive detections, to reliably detect truck departures. An experimental test using 7,225 images from one day of operation (6,567 daytime and 658 nighttime images) under varying conditions was conducted to evaluate the performance of the proposed algorithm. The results demonstrate the robustness of the algorithm under several challenging situations unique to terminal gates, including perspective occlusion, cast shadows, occlusion by work vehicles, people, and nearby trucks, nighttime and varying lighting conditions, and multiple-lane departures. The correct detection rate is 98.1% for daytime images and 90.8% for nighttime images, for an overall correct detection rate of 97.6%.
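To make the three-step pipeline concrete, the following is a minimal sketch of how two ROIs per lane, frame differencing, and a departure state machine could fit together. It is not the authors' implementation: the ROI coordinates, thresholds, state names, and helper functions (`roi_changed`, `LaneState`, `process_frames`) are hypothetical, and the paper's decision rules for cast shadows and perspective occlusion are not reproduced here.

```python
# Illustrative sketch only: a two-ROI frame-differencing detector with a simple
# departure state machine. All thresholds, ROI coordinates, and state names are
# assumptions, not values from the paper.

import numpy as np

# Hypothetical ROI bounds for one lane: (row_start, row_end, col_start, col_end)
ENTRY_ROI = (200, 260, 100, 220)   # near the gate booth
EXIT_ROI = (140, 200, 100, 220)    # farther along the lane, toward the exit

CHANGE_THRESHOLD = 25    # per-pixel grayscale difference counted as "changed"
OCCUPANCY_RATIO = 0.15   # fraction of changed ROI pixels counted as activity


def roi_changed(prev_gray: np.ndarray, curr_gray: np.ndarray, roi) -> bool:
    """Frame-differencing change detection restricted to one ROI."""
    r0, r1, c0, c1 = roi
    diff = np.abs(curr_gray[r0:r1, c0:c1].astype(np.int16)
                  - prev_gray[r0:r1, c0:c1].astype(np.int16))
    return np.mean(diff > CHANGE_THRESHOLD) > OCCUPANCY_RATIO


class LaneState:
    """Minimal state machine: a departure is flagged when activity seen at the
    entry ROI is later seen at the exit ROI and then both ROIs go quiet."""

    def __init__(self):
        self.state = "IDLE"  # IDLE -> AT_GATE -> LEAVING -> (departure) -> IDLE

    def update(self, entry_active: bool, exit_active: bool) -> bool:
        departed = False
        if self.state == "IDLE" and entry_active:
            self.state = "AT_GATE"
        elif self.state == "AT_GATE" and exit_active:
            self.state = "LEAVING"
        elif self.state == "LEAVING" and not exit_active and not entry_active:
            departed = True
            self.state = "IDLE"
        return departed


def process_frames(frames):
    """frames: iterable of grayscale images (2-D uint8 arrays) at <= 1 fps."""
    lane = LaneState()
    departures = []
    prev = None
    for idx, frame in enumerate(frames):
        if prev is not None:
            entry = roi_changed(prev, frame, ENTRY_ROI)
            exit_ = roi_changed(prev, frame, EXIT_ROI)
            if lane.update(entry, exit_):
                departures.append(idx)  # frame index approximates departure time
        prev = frame
    return departures
```

At low frame rates a stationary truck produces little inter-frame change, so in this simplified sketch activity effectively marks arrival and departure motion; the paper's additional decision rules (e.g., for shadows, work vehicles, and occlusion by nearby trucks) would sit on top of logic of this kind.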