Abstract
The material properties of an object's surface are critical for robotic manipulation and for interaction with the surrounding environment. Tactile sensing can provide rich information about the material characteristics of an object's surface, so it is important to convey and interpret tactile information about material properties to users during interaction. In this paper, we propose a visual-tactile cross-modal retrieval framework that conveys tactile information about surface materials for perceptual estimation. In particular, we use tactile measurements of a new, unknown surface material to retrieve a perceptually similar surface from an available set of visual surface samples. For this framework, we develop a deep cross-modal correlation learning method that combines the high-level nonlinear representations of a deep extreme learning machine with the class-paired correlation learning of cluster canonical correlation analysis. Experimental results on a publicly available dataset validate the effectiveness of the proposed framework and method.
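The class-paired correlation step can be illustrated with a small sketch. The code below is a minimal toy implementation of cluster canonical correlation analysis (cluster CCA) on synthetic data: every tactile sample is paired with every same-class visual sample, ordinary CCA is solved on the resulting pairs, and a tactile query then retrieves the most similar visual sample in the learned shared space. The function and variable names, feature dimensions, and synthetic data are assumptions for illustration; the paper's deep extreme learning machine feature extraction stage, which would precede this step, is omitted.

```python
import numpy as np

def _inv_sqrt(M):
    """Inverse square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def cluster_cca(X, Y, lx, ly, n_components, reg=1e-3):
    """Cluster CCA sketch: pair every X sample with every same-class
    Y sample, then solve standard CCA on the pairs (regularized)."""
    PX, PY = [], []
    for c in np.unique(lx):
        for xi in X[lx == c]:          # note: pairing is quadratic per class
            for yj in Y[ly == c]:
                PX.append(xi)
                PY.append(yj)
    PX, PY = np.asarray(PX), np.asarray(PY)
    mx, my = PX.mean(axis=0), PY.mean(axis=0)
    PX, PY = PX - mx, PY - my
    n = len(PX)
    Cxx = PX.T @ PX / n + reg * np.eye(X.shape[1])
    Cyy = PY.T @ PY / n + reg * np.eye(Y.shape[1])
    Cxy = PX.T @ PY / n
    # SVD of the whitened cross-covariance gives the canonical directions.
    K = _inv_sqrt(Cxx) @ Cxy @ _inv_sqrt(Cyy)
    U, _, Vt = np.linalg.svd(K)
    Wx = _inv_sqrt(Cxx) @ U[:, :n_components]
    Wy = _inv_sqrt(Cyy) @ Vt.T[:, :n_components]
    return Wx, Wy, mx, my

# --- toy demo: tactile queries retrieve same-class visual samples ---
rng = np.random.default_rng(0)
n_cls, per, d_t, d_v, dim = 3, 10, 8, 6, 2
ct = rng.normal(scale=5.0, size=(n_cls, d_t))   # tactile class centers
cv = rng.normal(scale=5.0, size=(n_cls, d_v))   # visual class centers
lab = np.repeat(np.arange(n_cls), per)
Xt = ct[lab] + rng.normal(scale=0.5, size=(n_cls * per, d_t))
Xv = cv[lab] + rng.normal(scale=0.5, size=(n_cls * per, d_v))

Wx, Wy, mx, my = cluster_cca(Xt, Xv, lab, lab, dim)
Zt = (Xt - mx) @ Wx                              # tactile queries, shared space
Zv = (Xv - my) @ Wy                              # visual gallery, shared space
Zt /= np.linalg.norm(Zt, axis=1, keepdims=True)
Zv /= np.linalg.norm(Zv, axis=1, keepdims=True)
pred = lab[np.argmax(Zt @ Zv.T, axis=1)]         # cosine-similarity retrieval
acc = float(np.mean(pred == lab))
```

On this well-separated synthetic data, retrieval by cosine similarity in the shared space recovers the query's class; real tactile and visual features would first pass through the deep nonlinear representation stage described in the paper.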
Published in: International Journal of Machine Learning and Cybernetics