Abstract

We describe an interactive approach for visual object analysis which exploits the ability of a robot to manipulate its environment. Knowledge of objects' mechanical properties is important in a host of robotics tasks, but measuring them can be impractical due to perceptual or mechanical limitations. By applying a periodic stimulus and a matched video filtering and analysis pipeline, we show that even stiff, fragile, or low-texture objects can be distinguished based on their mechanical behaviours. We construct a novel, linear filter exploiting the periodicity of the stimulus to reduce noise, enhance contrast, and amplify motion by a selectable gain; the proposed filter is significantly simpler than previous approaches to motion amplification. We further propose a set of statistics based on dense optical flow derived from the filtered video, and demonstrate visual object analysis based on these statistics for objects offering low contrast and limited deflection. Finally, we analyze 7 object types over 59 trials under varying illumination and pose, demonstrating that objects are linearly distinguishable under this approach, and establish the viability of estimating the fluid level in a cup from the same statistics.
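To make the pipeline concrete, the sketch below illustrates one plausible realization of the two stages named in the abstract, under stated assumptions rather than as the paper's exact method: a linear, phase-locked filter that averages frames at the same phase of a known stimulus period (reducing uncorrelated noise) and scales the periodic deviation from the temporal mean by a gain, followed by summary statistics of dense optical flow computed with OpenCV's Farnebäck method. The names `period`, `alpha`, `phase_locked_amplify`, and `flow_statistics` are illustrative assumptions, not identifiers from the paper.

```python
# Hedged sketch: a phase-synchronized linear filter plus dense-flow statistics,
# assuming grayscale video and a stimulus period known in frames.
import numpy as np
import cv2


def phase_locked_amplify(frames: np.ndarray, period: int, alpha: float) -> np.ndarray:
    """frames: (T, H, W) grayscale video; returns one amplified stimulus cycle.

    Averaging frames captured at the same stimulus phase across cycles
    attenuates noise that is uncorrelated with the stimulus; scaling the
    deviation from the temporal mean by `alpha` amplifies the periodic motion.
    Both steps are linear in the input video.
    """
    T, H, W = frames.shape
    cycles = T // period
    stack = frames[: cycles * period].reshape(cycles, period, H, W).astype(np.float32)
    phase_mean = stack.mean(axis=0)              # (period, H, W): noise-reduced cycle
    dc = phase_mean.mean(axis=0, keepdims=True)  # temporal mean (static background)
    return dc + alpha * (phase_mean - dc)        # amplified periodic component


def flow_statistics(video: np.ndarray) -> np.ndarray:
    """Simple summary statistics of dense optical flow over consecutive frames."""
    frames_u8 = np.clip(video, 0, 255).astype(np.uint8)
    mags = []
    for prev, nxt in zip(frames_u8[:-1], frames_u8[1:]):
        flow = cv2.calcOpticalFlowFarneback(
            prev, nxt, None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        mags.append(np.linalg.norm(flow, axis=2))
    mags = np.stack(mags)
    # One possible feature vector for a linear classifier over object types.
    return np.array([mags.mean(), mags.std(), mags.max(), np.percentile(mags, 95)])
```

In this sketch the feature vector from `flow_statistics` would feed a linear classifier, consistent with the abstract's claim that the object types are linearly distinguishable; the specific statistics chosen here are assumptions for illustration.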
