Abstract

Discriminating between line-of-sight (LOS) and non-line-of-sight (NLOS) conditions, or LOS identification, is important for a variety of purposes in wireless systems, including localization and channel modeling. LOS identification is especially challenging in vehicle-to-vehicle (V2V) networks, since a variety of physical effects occurring at different spatial and temporal scales can affect the presence of LOS. This paper investigates machine learning techniques for LOS identification in V2V networks using an extensive set of measurement data and then develops robust and efficient identification solutions. Our approach exploits several static and time-varying features of the channel impulse response (CIR), which are shown to be effective. Specifically, we develop a fast identification solution that can be trained using the power angular spectrum. Moreover, based on the measurement data, we compare three machine learning methods, i.e., support vector machine, random forest, and artificial neural network, in terms of their ability to train and generate the classifier. The results of our experiments, conducted under various V2V environments and validated using $K$-fold cross-validation, show that our techniques can distinguish LOS/NLOS conditions with an error rate as low as 1%. In addition, we investigate the impact of different training and validation strategies on identification accuracy.
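
The abstract does not specify the exact CIR features or tooling, so the sketch below is purely illustrative: it assumes a feature set common in the NLOS-identification literature (total energy, RMS delay spread, amplitude kurtosis, rise time), synthetic stand-in data in place of the V2V measurements, and scikit-learn implementations of the three classifiers the paper compares (SVM, random forest, ANN), evaluated with K-fold cross-validation as the abstract describes.

"""Hedged sketch: LOS/NLOS classification from CIR features.

Assumptions (not taken from the paper): the feature set and the
synthetic data below are illustrative stand-ins for the measured
V2V channel features and measurement campaign.
"""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def cir_features(h, t):
    """CIR statistics often used for NLOS identification.

    h : CIR samples, t : delay axis (same length).
    """
    p = np.abs(h) ** 2                            # power delay profile
    energy = p.sum()                              # total received energy
    tau_mean = (t * p).sum() / energy             # mean excess delay
    tau_rms = np.sqrt(((t - tau_mean) ** 2 * p).sum() / energy)  # RMS delay spread
    amp = np.abs(h)
    kurt = ((amp - amp.mean()) ** 4).mean() / amp.std() ** 4     # amplitude kurtosis
    rise = t[np.argmax(amp)] - t[np.argmax(amp > 0.1 * amp.max())]  # rise time
    return [energy, tau_rms, kurt, rise]


# Illustrative synthetic CIRs in place of the measurement set:
# NLOS channels get a slower-decaying power delay profile.
rng = np.random.default_rng(0)
t = np.linspace(0, 1e-6, 256)                     # 1 us delay window
X, y = [], []
for label in (0, 1):                              # 0 = LOS, 1 = NLOS
    for _ in range(200):
        decay = 1e-7 if label == 0 else 3e-7
        h = rng.standard_normal(256) * np.exp(-t / decay)
        X.append(cir_features(h, t))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

# Compare the three classifiers with K-fold cross-validation (K = 10).
cv = KFold(n_splits=10, shuffle=True, random_state=0)
models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(32,),
                                       max_iter=2000, random_state=0)),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=cv)
    print(f"{name}: mean error rate {1 - acc.mean():.3f}")

On real measurement data, the relative ranking of the three classifiers and the achievable error rate would of course depend on the chosen features, the training/validation strategy, and the V2V environments.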
