Abstract

QoE (Quality of Experience), the quality as perceived by the users, is the most important form of QoS (Quality of Service) among those defined at all levels, since the users are the ultimate recipients of the services. In mobile ad hoc networks (MANETs) as well, the provision of high QoE is one of the most important issues. Some applications of ad hoc networks require support for real-time multimedia streaming, such as live audio and video, over the network. Realizing this type of service with high quality is therefore in strong demand; nevertheless, achieving high quality in ad hoc networks is very difficult. The cross-layer design architecture (Srivastava & Motani, 2005) is a promising approach to high-quality communication in ad hoc networks. The architecture exploits interaction among two or more layers. Although the layered architecture of IP-based networks has advantages such as reduced design complexity, it is not well suited to wireless networks, because the nature of the wireless medium makes it difficult to decouple the layers. There are many studies on cross-layer design for multimedia streaming. In (Gharavi & Ban, 2004) and (Zhao et al., 2006), the number of hops maintained by the routing protocol is used to adapt the video coding rate to the network capacity: when there are many hops between the sender and the receiver, the sender reduces its coding rate. This is a cross-layer design between the network and application layers. Abd El Al et al. (2006) have proposed an error recovery mechanism for real-time video streaming that combines FEC and multipath retransmission; the scheme determines the strength of the error correction code and the quantization parameter for video encoding according to the number of hops. Frias et al. (2005) exploit a multipath routing protocol to schedule prioritized video streams and best-effort traffic on the basis of the number of available routes. Nunome & Tasaka (2005) have proposed the MultiPath streaming scheme with Media Synchronization control (MPMS), which treats audio and video as two separate transport streams and sends them over different routes when multipath routes are available. Furthermore, to remedy the temporal structure of the media streams disturbed by multipath transmission, it employs media synchronization control, which is an application-level QoS control.
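As a rough illustration of the hop-count-based rate adaptation described above, the following Python sketch maps the hop count reported by the routing protocol to a video coding rate chosen at the sender. The function names, thresholds, and bitrates are assumptions introduced here for illustration only; they are not taken from (Gharavi & Ban, 2004), (Zhao et al., 2006), or the other cited papers.

```python
# Illustrative sketch only: thresholds, rates, and the encoder interface
# below are assumptions, not values or APIs from the cited work.

def select_coding_rate_kbps(hop_count: int) -> int:
    """Pick a video coding rate from the hop count reported by the routing layer.

    More hops generally mean lower end-to-end capacity and a higher loss
    probability, so the sender encodes at a lower rate over long routes.
    """
    if hop_count <= 2:
        return 512   # short route: encode at the highest rate
    elif hop_count <= 4:
        return 256   # medium route: back off moderately
    else:
        return 128   # long route: use the lowest, most robust rate


def on_route_update(hop_count: int, encoder) -> None:
    """Cross-layer hook: the network layer notifies the application layer of a
    route change, and the application adjusts the (hypothetical) encoder."""
    encoder.set_bitrate_kbps(select_coding_rate_kbps(hop_count))
```

In a real system the mapping would more likely be derived from measured per-route throughput and loss than from fixed hop-count thresholds, but the sketch captures the essential cross-layer interaction: routing-layer state drives an application-layer encoding decision.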
