Abstract

Developing universal and context-invariant methods is one of the hardest challenges in computer vision. Background subtraction (BS), an essential precursor to foreground detection in most machine vision applications, is no exception. Due to overreliance on statistical observations, most BS techniques show unpredictable behavior in dynamic, unconstrained scenarios in which the characteristics of the operating environment are either unknown or change drastically. To achieve superior foreground detection quality across unconstrained scenarios, we propose a new technique, called perception-inspired background subtraction (PBS), which avoids overreliance on statistical observations by making key modeling decisions based on the characteristics of human visual perception. PBS exploits a human-perception-inspired confidence interval to associate an observed intensity value with another intensity value during both model learning and background-foreground classification. The perception-inspired confidence interval is also used to identify redundant samples, thus ensuring an optimal number of samples in the background model. Furthermore, PBS dynamically varies the model adaptation speed (learning rate) at the pixel level based on observed scene dynamics to ensure faster adaptation to changed background regions, as well as longer retention of stationary foreground objects. Extensive experimental evaluations on a wide range of benchmark datasets validate the efficacy of PBS compared to the state of the art for unconstrained video analytics.
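
The abstract does not give the algorithm's equations, so the sketch below only illustrates the general idea it describes: a sample-based background model in which pixel matching, redundant-sample pruning, and stochastic updates are all driven by an intensity-dependent confidence interval. The Weber's-law-style interval formula, the constants, and all function names are assumptions made for illustration, not the paper's actual definitions.

```python
import numpy as np

# Hypothetical parameters; the paper's actual values and interval definition may differ.
WEBER_FRACTION = 0.1   # assumed Weber-law constant for the perceptual interval
MIN_MATCHES = 2        # samples that must fall inside the interval to call a pixel background
MAX_SAMPLES = 20       # upper bound on per-pixel model size

def perceptual_interval(intensity):
    """Half-width of the assumed perception-inspired confidence interval.

    Modeled here as a Weber's-law-style just-noticeable difference:
    brighter pixels tolerate larger absolute deviations."""
    return WEBER_FRACTION * intensity + 3.0  # small additive floor for dark pixels

def classify_pixel(observation, samples):
    """Label a pixel as background (True) if enough model samples lie
    within the perceptual interval around the observed intensity."""
    tau = perceptual_interval(observation)
    matches = np.sum(np.abs(samples - observation) <= tau)
    return matches >= MIN_MATCHES

def prune_redundant(samples):
    """Drop samples that fall inside another sample's perceptual interval,
    keeping the per-pixel model compact (one of PBS's stated goals)."""
    kept = []
    for s in sorted(samples):
        if not kept or abs(s - kept[-1]) > perceptual_interval(kept[-1]):
            kept.append(s)
    return np.array(kept[:MAX_SAMPLES])

def update_model(observation, samples, learning_rate):
    """Stochastically absorb the observation into the model; a higher
    learning_rate (driven by local scene dynamics) updates more often."""
    if np.random.rand() < learning_rate:
        samples = prune_redundant(np.append(samples, observation))
    return samples

# Toy usage for a single pixel's sample model.
model = np.array([100.0, 104.0, 180.0])
print(classify_pixel(102.0, model))   # True: close to the 100/104 samples
print(classify_pixel(150.0, model))   # False: outside every sample's interval
model = update_model(102.0, model, learning_rate=0.5)
```

In this sketch the learning_rate argument is the hook where the abstract's per-pixel, scene-dynamics-driven adaptation speed would plug in: regions detected as changing would pass a higher rate, while stationary foreground regions would pass a lower one.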
