Abstract

Recent improvements in extended reality (XR) technology have led to an increase in XR products and solutions in the industry, while raising requirements for new or improved architectural concepts. Meeting these requirements is particularly complex because XR applications typically combine 3D geometric rendering with multimedia paradigms. This article outlines the main concepts relevant to XR from both a game-engineering and a multimedia-streaming-system perspective. XR requires new metadata and media/game orchestration to enable complex interaction between users, objects, and (volumetric) multimedia content, which in turn imposes new synchronization requirements (e.g., for global object state and positioning). Furthermore, the article presents the functional blocks needed in new XR system architectures and shows how they glue the game and media spaces together. The discussion of functional components and architecture relates to ongoing activities in relevant standardization bodies such as Khronos, the Moving Picture Experts Group (MPEG), and the 3rd Generation Partnership Project (3GPP). For XR to succeed in the long term, the industry needs to agree on interoperable solutions and on how to merge the game and media paradigms to enable complex multiuser XR applications.
