Abstract

This paper addresses the optimal rate allocation (ORA) problem: given a target bit rate constraint, determine an optimal rate allocation among sensors such that the overall distortion of the reproduced data is minimized. Optimal rate allocation algorithms are proposed to determine the coding bit rate of each sensor in both single-hop and multihop sensor networks. Extensive simulations are conducted using temperature readings from a real-world dataset. The results show that at low bit rates the optimal rate allocation improves on uniform rate allocation by about 2.745 dB in terms of SNR and by nearly 7.602 in terms of MSE. Spatial-temporal range queries are also evaluated to confirm that our approach is often sufficient to provide approximate statistics for range queries.
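To make the ORA formulation concrete, the following minimal sketch shows one way a rate budget could be split among sensors under an assumed Gaussian rate-distortion model D_i(R_i) = sigma_i^2 * 2^(-2 R_i), solved by classic reverse water-filling. This is only an illustration of the general problem statement, not the algorithm proposed in the paper; the function name `allocate_rates` and the parameters `var` and `R_total` are hypothetical.

```python
# Illustrative sketch (assumption): allocate a total bit budget among N sensors
# under an assumed per-sensor distortion model D_i(R_i) = var_i * 2^(-2*R_i),
# minimizing the summed distortion via the Lagrangian / reverse water-filling
# closed form. Not the paper's proposed algorithm.
import numpy as np

def allocate_rates(var, R_total):
    """Split R_total bits among sensors with signal variances `var` so that
    sum_i var_i * 2^(-2*R_i) is minimized, with nonnegative rates."""
    var = np.asarray(var, dtype=float)
    active = np.ones(len(var), dtype=bool)   # sensors that receive a positive rate
    rates = np.zeros(len(var))
    while True:
        v = var[active]
        # Closed-form optimum over the active set: an equal share of the budget
        # plus a correction toward sensors with larger variance.
        geo_mean = np.exp(np.mean(np.log(v)))
        r = R_total / active.sum() + 0.5 * np.log2(v / geo_mean)
        if (r >= 0).all():
            rates[active] = r
            return rates
        # Drop sensors whose unconstrained optimum went negative, then re-solve.
        active[np.where(active)[0][r < 0]] = False

# Example: 4 sensors sharing an 8-bit budget; higher-variance sensors get more bits.
print(allocate_rates([4.0, 1.0, 0.25, 0.05], R_total=8))
```

Under this assumed model the optimum equalizes per-sensor distortion, which is why higher-variance (harder-to-compress) sensors receive larger rates; the uniform allocation used as a baseline in the paper would instead give every sensor R_total/N bits.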
