Abstract

This work describes a parallelizable optical flow field estimator based on a modified batch version of the Self-Organizing Map (SOM). The estimator handles the ill-posedness of gradient-based motion estimation via a novel combination of regression and self-organization. The aperture problem is treated using an algebraic framework that partitions the motion estimates obtained from regression into two sets: one, \(H_c\), containing motion estimates with high confidence, and another, \(H_p\), containing those with low confidence. The self-organization step uses a specifically designed pair of sets: the training set \(Q = H_c\) and the initial weight set \(W = H_c \cup H_p\). It is shown that, with this choice of training and initial weight sets, the interpolation of flow vectors is achieved primarily through the regularization property of the SOM. Moreover, the computationally involved step of finding the winner unit in the SOM simplifies to indexing into a 2D array, making the algorithm highly scalable. To preserve flow discontinuities at occlusion boundaries, we design an anisotropic neighborhood function for the SOM that uses a novel distance measure based on the residual of the optical flow constraint equation. A multi-resolution (pyramidal) approach is used to estimate large motions. Because self-organization-based motion estimation is computationally intensive, parallel processing on graphics processing units is used for speedup. As the algorithm is data-parallel (indeed, datum-parallel), the estimator can run in real time given a sufficient number of computing cores. Using the ground truth available from the Middlebury database, error metrics such as average angular error and average end-point error are computed and shown to be comparable with those of other leading techniques.
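The role of the two sets described above can be illustrated with a toy batch-SOM interpolation sketch. Everything here is an assumption for illustration: the function name, the isotropic Gaussian neighborhood (the paper uses an anisotropic, residual-based one), and the confidence mask are all hypothetical; only the set construction (training set \(Q = H_c\), initial weights \(W = H_c \cup H_p\)) and the indexing-based winner step follow the abstract. Because each weight unit sits at a fixed pixel of the 2D grid, the winner for a training sample at pixel \((y, x)\) is simply the unit at \((y, x)\), i.e. an array index, not a search:

```python
import numpy as np

def som_flow_interpolate(flow, conf_mask, n_iter=10, sigma0=5.0):
    """Toy batch-SOM interpolation of a flow field (illustrative sketch only).

    flow      : (H, W, 2) initial flow estimates from regression (H_c union H_p)
    conf_mask : (H, W) bool, True where the estimate is high confidence (H_c)
    """
    H, W, _ = flow.shape
    weights = flow.copy()                  # initial weight set W = H_c U H_p
    ys, xs = np.nonzero(conf_mask)         # training set Q = H_c
    q = flow[ys, xs]                       # flow vectors of the training samples
    gy, gx = np.mgrid[0:H, 0:W]
    for it in range(n_iter):
        sigma = sigma0 * (0.5 ** it)       # shrinking (isotropic) neighborhood
        # Winner for each training sample is its own grid cell (pure array
        # indexing), so the batch update reduces to a neighborhood-weighted
        # average of the high-confidence vectors at every unit.
        num = np.zeros_like(weights)
        den = np.zeros((H, W, 1))
        for y, x, v in zip(ys, xs, q):
            h = np.exp(-((gy - y) ** 2 + (gx - x) ** 2) / (2.0 * sigma ** 2))
            num += h[..., None] * v
            den += h[..., None]
        # Units reached by the neighborhood get interpolated; others keep
        # their previous weights.
        weights = np.where(den > 1e-12, num / np.maximum(den, 1e-12), weights)
    return weights
```

With all high-confidence vectors equal, every low-confidence unit converges to that vector, which is the regularization/interpolation effect the abstract attributes to the SOM; discontinuity preservation would additionally require replacing the Gaussian with the anisotropic, residual-based neighborhood.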
