Abstract

Due to its limited performance and flexibility, the traditional frame difference method cannot precisely detect moving objects from the motion of the object region in each frame across varied video sequences, so the object is not extracted accurately in the foreground; this remains a serious concern. The three-frame and five-frame difference approaches require longer computation time for object detection and lose frame information. To address these shortcomings, a modified inter frame difference (MIFD) method is proposed. It detects moving objects in video under various environmental conditions with little data loss and in a short time. MIFD involves constructing a reference frame, computing the inter frame difference and a motion frame, and detecting moving object(s) in a frame by drawing rectangular blobs around connected components in the video sequence. The performance of the proposed algorithm is compared with previously reported results of the codebook model (CB), self-organizing background subtraction (SOBS), local binary pattern histogram (LBPH), robust background subtraction for network surveillance in H.264, GMM, ViBe, frame difference, three-frame difference, improved three-frame difference, and a combined three-frame difference and background subtraction model. The experimental results demonstrate that the proposed method outperforms the others in accurately detecting moving object(s) in video under challenging environmental conditions.

Keywords: Video surveillance; Modified inter frame difference method; Motion frame; Object detection; Connected components
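The core pipeline the abstract describes, thresholding an inter-frame difference into a motion frame and then boxing connected components, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold value, the 4-connectivity labeling, and the `min_area` filter are all assumptions made here for the example.

```python
import numpy as np
from collections import deque

def motion_mask(prev, curr, thresh=25):
    """Motion frame: pixels whose intensity changed between two
    consecutive grayscale frames by more than `thresh` (assumed value)."""
    return np.abs(curr.astype(int) - prev.astype(int)) > thresh

def bounding_boxes(mask, min_area=4):
    """Label 4-connected components of the binary motion mask with a
    BFS flood fill and return (x, y, w, h) boxes for each component
    whose pixel count is at least `min_area` (assumed noise filter)."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    boxes = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood-fill one component, collecting its coordinates.
                queue = deque([(y, x)])
                seen[y, x] = True
                ys, xs = [y], [x]
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            ys.append(ny)
                            xs.append(nx)
                            queue.append((ny, nx))
                if len(ys) >= min_area:
                    boxes.append((min(xs), min(ys),
                                  max(xs) - min(xs) + 1,
                                  max(ys) - min(ys) + 1))
    return boxes
```

In practice one frame of the pair would be the constructed reference frame rather than simply the previous frame, and a library routine (e.g. OpenCV's `cv2.connectedComponentsWithStats`) would replace the hand-rolled BFS; the sketch only shows the data flow from difference image to rectangular blobs.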

