Abstract
Within the videoconferencing world, systems have existed for some time that offer two-way, and even multipoint, video, audio and data links. Further work on these systems concentrates on conveying a stronger feeling of 'presence', for example by providing eye contact or by representing participants in a shared virtual space. In the virtual reality community, particularly within the Virtual Reality Modeling Language (VRML) group, much work is in progress to define distributed virtual environments. These offer immersive environments, accessible through World Wide Web (WWW) browsers, in which personal representations, or avatars, can move about and interact in 3D worlds, both with the world itself and with other avatars. The new VRML 2.0 specification, based on Silicon Graphics' Moving Worlds proposal, now enables much more dynamic and interactive environments, and software such as Dimension-X's Liquid Reality helps produce the Java code needed to give avatars and other virtual entities realistically complex behaviour. The convergence of these technologies is both obvious and desirable, and the forthcoming MPEG-4 standardisation effort will address precisely this new domain. This paper details the current status of the MPEG-4 standard and discusses its possible application to the field of telepresence in shared virtual reality spaces.
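To make the avatar-behaviour mechanism mentioned above more concrete, the sketch below illustrates how a behaviour class might be attached to a VRML 2.0 Script node through the Java scripting interface drafted for Moving Worlds. This is an illustrative assumption, not material from the paper: the class name AvatarBehaviour, the eventIn "touched" and the eventOut "newPosition" are hypothetical, and the scene is assumed to declare them on a Script node and ROUTE them to the avatar's Transform.

    // Hypothetical sketch of a VRML 2.0 Script node behaviour written in Java.
    // The browser loads this class for a Script node that declares
    // "eventIn SFTime touched" and "eventOut SFVec3f newPosition".
    import vrml.Event;
    import vrml.field.SFVec3f;
    import vrml.node.Script;

    public class AvatarBehaviour extends Script {

        private SFVec3f newPosition;   // handle to the Script node's eventOut

        // Called once by the VRML browser when the Script node is initialised.
        public void initialize() {
            newPosition = (SFVec3f) getEventOut("newPosition");
        }

        // Called whenever a routed event (e.g. a TouchSensor's touchTime)
        // reaches the Script node; here it simply moves the avatar to an
        // illustrative new position by sending a value on the eventOut.
        public void processEvent(Event event) {
            if (event.getName().equals("touched")) {
                newPosition.setValue(0.0f, 0.0f, 2.0f);
            }
        }
    }

In this style of design, the VRML 2.0 world describes geometry and routing declaratively, while the Java class supplies the dynamic behaviour; tools such as Liquid Reality are described in the abstract as helping to produce exactly this kind of behaviour code.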