Abstract

The advent of high-speed networks and high-performance PCs has prompted research on networked telepresence, which allows a user to see virtualized real scenes at remote places. View-dependent representation, which presents a user with arbitrary view images through an HMD or an immersive display, is especially effective in creating a rich sense of telepresence. The goal of our work is to realize a networked novel view telepresence system that enables multiple users to control their viewpoints and view directions independently by virtualizing real dynamic environments. In this paper, we describe a method for generating novel views from multiple omni-directional images captured at different positions. We mainly describe our prototype system, which offers high scalability and allows multiple users to use the system simultaneously, and report experiments conducted with it. The novel view telepresence system constructs a virtualized environment from live video streams. The live videos are transferred to multiple users over a multicast protocol without increasing network traffic. For each user, the system synthesizes a view image corresponding to the viewpoint and view direction measured by a magnetic sensor attached to the HMD and presents the generated view on the HMD. Our system generates each user's view image in real time by establishing correspondences among the omni-directional images and estimating the cameras' intrinsic and extrinsic parameters in advance.
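As a rough illustration of the view-dependent rendering step described above, the following Python sketch renders a perspective view from a single equirectangular omni-directional image given a user's view direction. It is a simplified, hypothetical example: the actual system blends several omni-directional cameras using precomputed correspondences and calibrated intrinsic/extrinsic parameters to handle viewpoint translation, and all function names and parameters here are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def ray_directions(fov_deg, width, height, R):
    """Per-pixel ray directions of a virtual perspective camera,
    rotated by R (3x3) to match the user's view direction."""
    f = 0.5 * width / np.tan(np.radians(fov_deg) / 2.0)
    xs, ys = np.meshgrid(np.arange(width) - width / 2.0,
                         np.arange(height) - height / 2.0)
    dirs = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    return dirs @ R.T

def sample_equirectangular(omni_img, dirs):
    """Look up each ray direction in an equirectangular panorama."""
    h, w, _ = omni_img.shape
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # latitude in [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (h - 1)).astype(int)
    return omni_img[v, u]

def render_view(omni_img, yaw_deg, fov_deg=90, width=640, height=480):
    """Generate one user's perspective view from a single omni image.
    A full system would blend multiple omni cameras via precomputed
    correspondences to account for the user's viewpoint translation."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    R = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])     # yaw-only rotation
    dirs = ray_directions(fov_deg, width, height, R)
    return sample_equirectangular(omni_img, dirs)

# Usage: render a 90-degree field-of-view image looking 30 degrees to the right.
omni = np.zeros((1024, 2048, 3), dtype=np.uint8)         # placeholder panorama
frame = render_view(omni, yaw_deg=30.0)
```

In a live setting, the yaw (and full pose) would be read each frame from the HMD-mounted tracker, so the rendered view follows the user's head motion.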
