Abstract

Structured light is a perception method that obtains 3D information from images of a scene by projecting synthetic features with a light emitter. Traditionally, this method assumes a rigid configuration, where the position and orientation of the light emitter with respect to the camera are known and calibrated beforehand. In this paper we propose a new omnidirectional structured light system in flexible configuration, which overcomes the rigidity of traditional structured light systems. We propose the use of an omnidirectional camera combined with a conic-pattern light emitter, from which we recover the 3D information of the conic in space. This reconstruction recovers the depth and orientation of the scene surface on which the conic pattern is projected. One application of our proposed structured light system in flexible configuration is a wearable omnidirectional camera with a low-cost laser held in hand, for personal assistance to the visually impaired.
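As background for the conic pattern used throughout, a conic in the image plane can be written in homogeneous matrix form, which is the representation typically used for correspondence and reconstruction conditions. The sketch below is illustrative (the unit circle is a stand-in for a projected conic, not data from the paper) and only checks the incidence condition x^T C x = 0:

```python
import numpy as np

# A conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 in homogeneous form
# x^T C x = 0, with a symmetric 3x3 matrix C.
# Illustrative coefficients: the unit circle x^2 + y^2 - 1 = 0.
a, b, c, d, e, f = 1.0, 0.0, 1.0, 0.0, 0.0, -1.0
C = np.array([[a,     b / 2, d / 2],
              [b / 2, c,     e / 2],
              [d / 2, e / 2, f    ]])

# Homogeneous point (1, 0, 1), i.e. (x, y) = (1, 0), lies on the circle.
x = np.array([1.0, 0.0, 1.0])
incidence = x @ C @ x  # ≈ 0 when the point lies on the conic
```

This matrix form is convenient because a projective transformation H maps the conic as C' = H^{-T} C H^{-1}, which is the kind of relation a conic correspondence condition builds on.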

Highlights

  • In computer vision, one of the most important goals is to obtain 3D information from the scene. This problem has been studied for many years [1]

  • We explore a new configuration for structured light systems, where both components, the camera and the light emitter, are free to move in space

  • We propose the use of an omnidirectional camera where the light emitter is visible and from which its relative location is partially computable


Summary

Introduction

In computer vision, one of the most important goals is to obtain 3D information from the scene. The main goal of structured light systems is to obtain depth and surface orientation from the deformed patterns projected onto the scene and observed by the vision system. A recently presented approach locates the camera and the light emitter on a mobile platform in a fixed configuration, where the deformation of the projected pattern allows the computation of the platform's position and orientation [16]. We use the image of the light pattern acquired by the omnidirectional camera, together with a virtual image generated from the calibrated light emitter, to perform the conic reconstruction algorithm. From this algorithm we compute the depth and orientation of the surface onto which the conic pattern has been projected.
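A first step in any such pipeline is recovering the conic observed in the image from the detected laser points. A standard way to do this, sketched here under the assumption of ideal (noise-free, already undistorted) image points, is a least-squares fit of the conic coefficients via the null space of the design matrix; this is a generic textbook technique, not the specific algorithm of the paper:

```python
import numpy as np

def fit_conic(points):
    """Least-squares fit of a conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to 2D points, taken as the SVD null-space vector of the design matrix.
    Returns (a, b, c, d, e, f) up to scale."""
    x, y = points[:, 0], points[:, 1]
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]  # right-singular vector of the smallest singular value

# Synthetic example: points sampled on the unit circle x^2 + y^2 = 1.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
pts = np.column_stack([np.cos(t), np.sin(t)])
a, b, c, d, e, f = fit_conic(pts)
# Up to scale, the fit should satisfy c/a ≈ 1, f/a ≈ -1, b ≈ d ≈ e ≈ 0.
```

With real laser detections the points are noisy and the camera is omnidirectional, so in practice the points would first be mapped through the calibrated camera model before fitting.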

Problem Definition
Omnidirectional Camera Model
Conic Laser Model
Conic Correspondence Condition
Depth Information Using a Structured Light System in Flexible Configuration
Computing Laser 3D Location
Laser Orientation
Projection Plane Location
Experiments
Experiments with Simulated Data
Image Processing
Conclusions
