Abstract
There has been an explosion in the number of wireless devices and in the deployment of wireless networks. Successful deployment depends on proper planning and design, and an important aspect of the planning phase is determining the extent to which diffraction (scatter) and reflection of propagated signals influence path loss. The best way to do so is to model the signal's path loss as it travels from the emitter to the receiver. In this work we propose a model for predicting path loss in wireless networks based on the radiosity method, which is commonly used to render realistic images in computer graphics. We created the radiosity path loss model and used it to compute signal strength and path loss at varying distances. We validated the model with an experiment carried out using the inSSIDer software; the experiment was repeated twice to ascertain the consistency of the results. At the 95% confidence level with 58 degrees of freedom, a Student's t-test showed that the difference between the values obtained in the two experiments was insignificant, indicating that the values captured during each experiment were stable and reliable. The results further showed that our model accurately predicted path loss, as demonstrated by curve fitting of the model results against the experimental results, which yielded a close fit.
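The repeatability check described above can be illustrated with a minimal sketch. The 58 degrees of freedom are consistent with two runs of 30 readings each (30 + 30 − 2 = 58); the data below are synthetic placeholders, not the authors' measurements, and the critical value is the approximate two-tailed Student's t threshold for α = 0.05 at 58 degrees of freedom.

```python
import math
import random
import statistics


def pooled_t_statistic(a, b):
    """Two-sample pooled-variance t statistic and its degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    # Pooled variance under the equal-variance assumption.
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se, na + nb - 2


# Synthetic stand-ins for two repeated runs of 30 RSSI readings (dBm).
random.seed(1)
run1 = [-60 + random.gauss(0, 2) for _ in range(30)]
run2 = [-60 + random.gauss(0, 2) for _ in range(30)]

t, df = pooled_t_statistic(run1, run2)
# Two-tailed critical value for alpha = 0.05 at df = 58 is roughly 2.00;
# |t| below that threshold means the runs are statistically consistent.
consistent = abs(t) < 2.00
```

With real measurements in place of the synthetic runs, a |t| below the critical value supports the paper's conclusion that the repeated experiments produced consistent values.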