Abstract

We propose a robust non-parametric strategy to weight scarce and imperfect ranging information, which is shown to significantly improve the accuracy of distance-based network localization algorithms. The proposed weights have a dispersion component, which captures the effect of noise under the assumption of bias-free samples, and a penalty component, which quantifies the risk of the latter assumption and penalizes it proportionally. The dispersion weights result from the application of small-scale statistics with confidence bounds optimized under a maximum entropy criterion that formalizes the empirical concept of reliability commonly found in the related literature. In turn, the penalty weights are derived from the relationship, established through statistical geometry, between the risk incurred by the bias-free assumption and the geometry of 3-node cliques. The performance of the distance-based network localization algorithm employing the proposed dispersion-penalty weights is compared against the Cramér-Rao lower bound (CRLB) and against equivalent algorithms employing alternative weights. The comparison reveals that, amongst the alternatives, the network localization algorithm with the proposed weights performs best and closest to an unbiased estimator.
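The abstract does not give the paper's formulas, but the two-component structure it describes can be sketched in Python. In this purely illustrative sketch, the dispersion weight is taken as the inverse of a sample-spread statistic (a stand-in for the paper's entropy-optimized small-scale confidence bounds), and the penalty weight shrinks with the triangle-inequality violation within a 3-node clique (a stand-in for the statistical-geometric bias-risk measure); both choices are assumptions, not the authors' method.

```python
import numpy as np

def dispersion_weight(samples):
    # Illustrative proxy: weight inversely proportional to the sample
    # standard deviation of repeated range measurements for one link.
    spread = np.std(samples, ddof=1)
    return 1.0 / (spread + 1e-9)  # small constant avoids division by zero

def penalty_weight(d_ij, d_ik, d_kj):
    # Illustrative proxy: in a 3-node clique {i, j, k}, a measured
    # distance d_ij exceeding d_ik + d_kj violates the triangle
    # inequality, hinting at bias; penalize proportionally.
    violation = max(0.0, d_ij - (d_ik + d_kj))
    return 1.0 / (1.0 + violation)

def combined_weight(samples_ij, d_ik, d_kj):
    # Combine both components into a single link weight.
    d_ij = float(np.mean(samples_ij))
    return dispersion_weight(samples_ij) * penalty_weight(d_ij, d_ik, d_kj)
```

A localization solver (e.g., weighted least squares over the measured distances) would then scale each link's residual by `combined_weight`, so that noisy or geometrically inconsistent ranges contribute less to the position estimate.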
