Abstract

Haptic skin palpation combined with three-dimensional (3D) skin surface reconstruction from in vivo skin images, which provides both tactile and visual information, has been receiving much attention. However, estimating the depth of the skin surface with a light field camera, which captures multiple images through a micro-lens array, is difficult because the low resolution of those images leads to erroneous disparity matching. The multiple low-resolution images decoded from a light field camera therefore limit the accuracy of the 3D surface reconstruction needed for haptic palpation. To overcome this, a deep learning method, generative adversarial networks (GANs), was employed to generate super-resolved skin images that preserve surface detail without blurring; accurate skin depth was then estimated through a sequence of steps, namely lens distortion correction, sub-pixel shifted image generation based on the phase shift theorem, cost-volume building, multi-label optimization, and hole filling and refinement, which constitutes a new approach to 3D skin surface reconstruction. Experimental results show that the deep-learning-based super-resolution method preserves the textural detail (wrinkles) of the skin images well, unlike other super-resolution methods. In addition, the depth maps computed with the proposed algorithm are more accurate and robust than those produced by other state-of-the-art depth map computation methods. Herein, we propose, for the first time, depth map estimation of skin surfaces using a light field camera and test it on several skin images; the experimental results establish the superiority of the proposed scheme.
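One of the pipeline steps named above, sub-pixel shifted image generation via the phase shift theorem, can be illustrated with a minimal sketch: a spatial shift of an image corresponds to multiplying its Fourier spectrum by a linear phase ramp. The function name `subpixel_shift` and the NumPy FFT formulation below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def subpixel_shift(image, dx, dy):
    """Shift a 2D grayscale image by a sub-pixel offset (dx, dy)
    using the Fourier phase-shift theorem: a spatial shift is
    equivalent to multiplying the spectrum by a linear phase ramp."""
    rows, cols = image.shape
    # Frequency grids (cycles per pixel) along each axis.
    fy = np.fft.fftfreq(rows).reshape(-1, 1)
    fx = np.fft.fftfreq(cols).reshape(1, -1)
    # Apply the phase ramp exp(-2*pi*i*(fx*dx + fy*dy)) in the Fourier domain.
    spectrum = np.fft.fft2(image)
    shifted_spectrum = spectrum * np.exp(-2j * np.pi * (fx * dx + fy * dy))
    # The imaginary residue after the inverse transform is numerical noise.
    return np.real(np.fft.ifft2(shifted_spectrum))

# Example: shift a synthetic image by a quarter pixel in x and half a pixel in y.
if __name__ == "__main__":
    img = np.random.rand(64, 64)
    shifted = subpixel_shift(img, dx=0.25, dy=0.5)
    print(shifted.shape)  # (64, 64)
```

Such sub-pixel shifted views are what a cost volume is typically built from: for each candidate disparity label, shifted views are compared against a reference view and the matching costs are stacked, after which multi-label optimization selects the best label per pixel.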
