Abstract
Recent improvements in extended reality (XR) technology have led to a growing number of XR products and solutions in industry, while raising requirements for new or improved architectural concepts. Meeting these requirements can be particularly complex, as XR applications often draw on both 3D geometric rendering and multimedia paradigms. This article outlines the main concepts relevant to XR from both a game-engineering and a multimedia-streaming-system perspective. XR requires new metadata and media/game orchestration to enable complex interaction between users, objects, and (volumetric) multimedia content, which in turn imposes new synchronization requirements (e.g., for global object state and positioning). Furthermore, the article presents the functional blocks needed in new XR system architectures and how they will glue the game and media spaces together. The discussion of functional components and architecture relates to ongoing activities in relevant standardization bodies such as Khronos, the Moving Picture Experts Group (MPEG), and the 3rd Generation Partnership Project (3GPP). For XR to succeed in the long term, the industry needs to agree on interoperable solutions and on how to merge the game and media paradigms to enable complex multiuser XR applications.