Abstract
A multi-aperture optical flow estimation method for an artificial compound eye
Highlights
Modern robotics requires solving complex tasks such as industrial operations [27,22,56], speech recognition [18], human-machine interaction [30], and navigation [47,21,23]
Among the variational optical flow methods submitted to the Sintel benchmark, EpicFlow ranks 17th, DeepFlow 20th, Classic+NL 43rd, and LDOF 50th
The evaluation is based on the Middlebury datasets, the Sintel datasets, and real data captured by the eCley; performance on the benchmark datasets with ground truth is measured with the standard metrics, average angular error (AAE) and average endpoint error (EPE), as sketched below
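For reference, a minimal sketch of the two standard metrics named above, assuming NumPy flow fields of shape (H, W, 2); EPE is the mean Euclidean distance between estimated and ground-truth flow vectors, and AAE follows the conventional definition as the angle between the space-time vectors (u, v, 1) and (u_gt, v_gt, 1). The function name and array layout here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def flow_errors(flow_est: np.ndarray, flow_gt: np.ndarray):
    """Return (AAE in degrees, EPE in pixels), averaged over all pixels.

    Both arrays are assumed to have shape (H, W, 2), with the horizontal
    component u in channel 0 and the vertical component v in channel 1.
    """
    u, v = flow_est[..., 0], flow_est[..., 1]
    ug, vg = flow_gt[..., 0], flow_gt[..., 1]

    # Average endpoint error: Euclidean distance between the estimated
    # and ground-truth flow vectors at each pixel, then averaged.
    epe = np.sqrt((u - ug) ** 2 + (v - vg) ** 2).mean()

    # Average angular error: angle between (u, v, 1) and (ug, vg, 1).
    # The clip guards against rounding pushing the cosine outside [-1, 1].
    num = u * ug + v * vg + 1.0
    den = np.sqrt(u ** 2 + v ** 2 + 1.0) * np.sqrt(ug ** 2 + vg ** 2 + 1.0)
    aae = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0))).mean()
    return aae, epe
```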
Summary
Modern robotics requires solving complex tasks such as industrial operations [27,22,56], speech recognition [18], human-machine interaction [30], and navigation [47,21,23]. Within this context, machine vision tasks are challenging and fundamental; see [16,26]. Since the eCley was proposed several years ago, only a few researchers have adopted this kind of camera in computer vision. This is partially because the optical axes of adjacent apertures of the eCley have a small offset to obtain a larger field of view (FOV), which leads to oblique incidence in the marginal apertures. The biggest challenge in using the eCley is that the small size and small FOV of each aperture image provide little context to support inference of the correspondence field