Abstract
A Hybrid Mobile Ad hoc Network (H-MANET) is a network in which a MANET is connected to the Internet through special nodes called Internet Gateways (IGs). Previous work in this field assumed infinite buffers, under which buffer overflow can never occur. In a realistic MANET, however, the buffer size of each node is strictly bounded, which causes packet loss and increases delay. In this paper, a mathematical model is proposed for multipath delay analysis for IG selection in an H-MANET. The model estimates the end-to-end delay of each individual path between the Source Node (SN) and the IGs. The network is modeled as an M/M/m/R queueing system, in which packets arrive according to a Poisson process at a bounded queue with finite buffer capacity. Burke's theorem is used to calculate the queueing delay of each individual hop as well as of the complete path from the SN to an IG. The best IG is then selected for data transmission based on the delay analysis of the multiple paths. Simulations were performed with OPNET Modeler 18.0, and the numerical results demonstrate the effectiveness of the proposed model.
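To make the delay computation concrete, the following is a minimal sketch (not the paper's actual implementation) of the mean sojourn time in an M/M/m/R queue, using the standard closed-form steady-state probabilities and Little's law. The function names (`mmmr_delay`, `path_delay`) and the per-hop parameterization are assumptions for illustration; summing independent per-hop delays along a path is the Burke-style tandem-queue approximation, which is exact only in the infinite-buffer case.

```python
from math import factorial

def mmmr_delay(lam, mu, m, R):
    """Mean sojourn time (queueing + service) in an M/M/m/R queue.

    lam: Poisson arrival rate, mu: per-server service rate,
    m:   number of servers, R: system capacity (queue + servers), R >= m.
    """
    a = lam / mu
    # Unnormalised steady-state probabilities p_n for n = 0..R.
    unnorm = []
    for n in range(R + 1):
        if n <= m:
            unnorm.append(a**n / factorial(n))
        else:
            unnorm.append(a**n / (factorial(m) * m**(n - m)))
    p0 = 1.0 / sum(unnorm)
    p = [p0 * u for u in unnorm]
    L = sum(n * pn for n, pn in enumerate(p))  # mean number in system
    lam_eff = lam * (1 - p[R])                 # arrivals not blocked by a full buffer
    return L / lam_eff                         # Little's law: W = L / lam_eff

def path_delay(hops):
    """End-to-end delay of a path as the sum of per-hop M/M/m/R delays.

    hops: list of (lam, mu, m, R) tuples, one per node on the path.
    Independence across hops is the Burke-style approximation noted above.
    """
    return sum(mmmr_delay(*h) for h in hops)
```

For example, a two-hop path with identical nodes simply doubles the single-hop delay, and as R grows the result converges to the classical M/M/1 delay 1/(mu - lam) for a stable single-server hop.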