Abstract
Multimedia documents must be playable on multiple device types. Usage and platform diversity therefore requires adapting documents to execution contexts that are generally not predictable at design time. In earlier work, a semantic framework for multimedia document adaptation was proposed. In this framework, a multimedia document is interpreted as the set of potential executions corresponding to the author's specification, and each target device determines a set of possible executions complying with its constraints. Adapting then amounts to selecting an execution that satisfies the target device constraints and is as close as possible to the initial composition. This theoretical adaptation framework does not specifically consider the main dimensions of multimedia documents, i.e., the temporal, spatial, and hypermedia dimensions. In this paper, we propose a concrete application of this framework to standard multimedia documents. For that purpose, we first define an abstract structure that captures the spatio-temporal and hypermedia dimensions of multimedia documents, and we develop an adaptation algorithm that transforms such a structure minimally according to device constraints. We then show how this can be used to adapt concrete multimedia documents in SMIL by converting the document into the abstract structure, applying the adaptation algorithm, and converting the result back into SMIL. The same approach can be used for other document formats without modifying the adaptation algorithm.
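To make the selection principle concrete, the following is a minimal sketch in Python of the core idea stated above: among the potential executions, keep only those satisfying the device constraints, and pick one minimizing the distance to the author's specification. The toy encoding (qualitative relations per object pair, a set of admissible alternatives, a Hamming-style distance) is an assumption for illustration, not the paper's actual abstract structure or algorithm.

```python
from itertools import product

# Hypothetical toy encoding (assumed, not the paper's structure): each pair
# of media objects carries one qualitative relation; the author's
# specification fixes a relation per pair, a potential execution may relax
# it to an admissible alternative, and the device forbids some relations.

# Author's specification: one relation per object pair.
spec = {("video", "caption"): "equals", ("caption", "logo"): "before"}

# Relations each pair may take in a potential execution (assumed sets).
alternatives = {
    ("video", "caption"): {"equals", "overlaps", "before"},
    ("caption", "logo"): {"before", "meets"},
}

# Device constraints: relations the target device cannot render,
# e.g. no simultaneous playback of two continuous media objects.
forbidden = {("video", "caption"): {"equals"}}


def adapt(spec, alternatives, forbidden):
    """Select a potential execution satisfying the device constraints
    that differs from the author's specification on the fewest pairs."""
    pairs = list(spec)
    best, best_cost = None, None
    for choice in product(*(alternatives[p] for p in pairs)):
        execution = dict(zip(pairs, choice))
        if any(execution[p] in forbidden.get(p, set()) for p in pairs):
            continue  # violates a device constraint
        cost = sum(execution[p] != spec[p] for p in pairs)
        if best_cost is None or cost < best_cost:
            best, best_cost = execution, cost
    return best


print(adapt(spec, alternatives, forbidden))
# -> an execution changing only ("video", "caption"), e.g. to "overlaps"
```

The exhaustive enumeration is only for readability; the adaptation algorithm developed in the paper works on the abstract spatio-temporal and hypermedia structure rather than on such an explicit product of alternatives.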