Abstract
I present a new focal-plane analog very-large-scale-integrated (aVLSI) sensor that estimates optical flow in two visual dimensions. Its computational architecture consists of a two-layer network of locally connected motion units that collectively estimate the optimal optical flow field. The applied gradient-based optical flow model assumes visual motion to be translational and smooth, and is formulated as a convex optimization problem. The model also guarantees that the estimation problem is well-posed regardless of the visual input by imposing a bias towards a preferred motion under ambiguous or noisy visual conditions. Model parameters can be globally adjusted, leading to a rich output behavior. Varying the smoothness strength, for example, can provide a continuous spectrum of motion estimates, ranging from normal to global optical flow. The non-linear network conductances improve the resulting optical flow estimate because they reduce spatial smoothing across large velocity differences and minimize the bias for reliable stimuli. Extended characterization and recorded optical flow fields from a 30 × 30 array prototype sensor demonstrate the validity of the optical flow model and the robustness and functionality of the computational architecture and its implementation.
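The abstract describes a gradient-based optical flow model posed as a convex optimization: a data (brightness-constancy) term, a smoothness term coupling neighboring motion units, and a bias toward a preferred motion that keeps the problem well-posed under ambiguous input. The sketch below is a simple software analogue of that formulation in the Horn–Schunck style, not the sensor's actual circuit equations; the function name, parameter names, and the Jacobi-style iteration are illustrative assumptions. Note how the bias weight `beta` makes the per-pixel 2×2 system invertible everywhere, mirroring the abstract's well-posedness claim, and how increasing `alpha` smooths the estimate from normal flow toward a near-global flow.

```python
import numpy as np

def optical_flow_biased_hs(I1, I2, alpha=0.1, beta=1e-3,
                           v_pref=(0.0, 0.0), n_iter=500):
    """Horn-Schunck-style flow with a bias toward a preferred motion.

    Minimizes (per pixel, summed over the image):
        (Ix*u + Iy*v + It)^2                      # brightness constancy
      + alpha * (neighbour differences of u, v)   # smoothness
      + beta  * ((u - u0)^2 + (v - v0)^2)         # bias / well-posedness
    via Jacobi iterations on the per-pixel 2x2 normal equations.
    (alpha here absorbs the neighbour-count factor; boundaries wrap.)
    """
    I1 = I1.astype(float)
    I2 = I2.astype(float)
    # Spatial gradients of the average frame; temporal derivative.
    Iy, Ix = np.gradient(0.5 * (I1 + I2))
    It = I2 - I1
    u = np.full_like(I1, v_pref[0])
    v = np.full_like(I1, v_pref[1])
    for _ in range(n_iter):
        # 4-neighbour averages, analogous to a resistive smoothing grid.
        ub = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                     + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        vb = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                     + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        # Per-pixel 2x2 system; beta > 0 guarantees det > 0 (well-posed).
        a11 = Ix * Ix + alpha + beta
        a22 = Iy * Iy + alpha + beta
        a12 = Ix * Iy
        b1 = alpha * ub + beta * v_pref[0] - Ix * It
        b2 = alpha * vb + beta * v_pref[1] - Iy * It
        det = a11 * a22 - a12 * a12
        u = (a22 * b1 - a12 * b2) / det
        v = (a11 * b2 - a12 * b1) / det
    return u, v

# Usage: recover a one-pixel rightward translation of a smooth blob.
n = 48
yy, xx = np.mgrid[0:n, 0:n]
I1 = np.exp(-((xx - 24) ** 2 + (yy - 24) ** 2) / (2 * 5.0 ** 2))
I2 = np.roll(I1, 1, axis=1)          # translate by +1 pixel in x
u, v = optical_flow_biased_hs(I1, I2)
gy, gx = np.gradient(I1)
mask = np.hypot(gx, gy) > 0.02       # reliable (high-gradient) region
```

In the reliable region the estimate points in the true direction of motion (positive `u`, near-zero `v`), while in blank regions the flow relaxes toward `v_pref`, which is the role the abstract assigns to the bias under ambiguous or noisy input.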