Abstract

In this paper, we propose a novel method for extracting moving foreground objects from sequences captured by a wearable camera undergoing strong motion. We use camera-motion-compensated frame differencing, enhanced with a novel kernel-based estimation of the probability density function of background pixels. These probability density functions filter out false foreground pixels in the motion-compensated difference frame. Because the estimation must rely on a limited number of measurements, we introduce a special spatio-temporal sample point selection and an adaptive thresholding method to address this challenge. Finally, foreground objects are constructed from the detected foreground pixels using the DBSCAN clustering algorithm.
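To make the pipeline concrete, the following is a minimal sketch of the abstract's overall flow: frame differencing, a kernel density estimate of the background used to suppress false foreground pixels, and DBSCAN grouping. This is not the paper's implementation: the function name, thresholds, Gaussian kernel choice, and the per-pixel sample array are illustrative assumptions, and the motion compensation, spatio-temporal sample selection, and adaptive thresholding steps described in the paper are abstracted away.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def foreground_objects(prev_frame, curr_frame, bg_samples, bandwidth=10.0,
                       diff_thresh=25.0, pdf_thresh=0.02, eps=3.0, min_samples=5):
    """Sketch: KDE-filtered frame differencing with DBSCAN grouping.

    bg_samples: (H, W, K) array of K past intensity samples per pixel,
    a stand-in for the paper's spatio-temporal sample point selection.
    All parameter values are illustrative, not from the paper.
    """
    # 1. Frame differencing (camera motion compensation assumed already applied).
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    candidates = diff > diff_thresh

    # 2. Gaussian-kernel estimate of the background pdf at each pixel's
    #    current intensity: p(x) = mean_k N(x; s_k, bandwidth^2).
    dev = curr_frame[..., None].astype(float) - bg_samples.astype(float)
    pdf = np.mean(np.exp(-dev**2 / (2.0 * bandwidth**2)), axis=-1) \
          / (bandwidth * np.sqrt(2.0 * np.pi))

    # 3. Keep only candidates that the background model finds unlikely,
    #    filtering false foreground pixels on the difference frame.
    fg = candidates & (pdf < pdf_thresh)

    # 4. Build objects from foreground pixels by clustering (row, col)
    #    coordinates with DBSCAN; label -1 marks noise pixels.
    pts = np.argwhere(fg)
    if len(pts) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    return [pts[labels == lab] for lab in set(labels) if lab != -1]
```

For example, on a synthetic pair of 32x32 frames where a 5x5 bright patch appears against a stable background of intensity 50, the sketch returns a single object containing the 25 patch pixels.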
