Abstract

Sensor networks employ a large number of battery-powered sensing nodes to achieve much improved surveillance quality. Because sensors are deployed randomly and densely, when the entire target area is k-covered, a significant portion of it is covered by more than k sensors. A neighbor selection scheme can identify these redundant sensors and put them into sleep mode to conserve energy, thus prolonging the network lifetime. Depending on the sensing modalities used in the network, the neighbor selection method can differ substantially. Most conventional sensor networks adopt scalar sensors with omni-directional sensing capability, so the neighborhood depends only on the distance between sensors. The focus of this paper is on neighbor selection in Visual Sensor Networks (VSNs), which consist of a large number of imaging sensors with directional sensing; the neighborhood therefore depends not only on the distance between sensors but also on their orientations and occlusion conditions. In this paper we present a semantic neighbor selection algorithm for VSNs, where semantic neighbors are defined as geographically close visual sensors that capture the same or a similar scene. Our semantic neighbor selection is based on the principle of image comparison, using a feature extraction approach that is both compact and accurate. We develop a so-called Extended Speeded-Up Robust Features (E-SURF) descriptor based on two widely used feature extraction schemes, SURF and SIFT. The E-SURF feature is more compact than SIFT in terms of data volume, so the semantic neighbor selection process does not incur heavy communication overhead, and it is more accurate than SURF in finding the right neighbors. To ensure the scheme works well in practical VSNs, we also present a protocol design for its implementation.
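
To illustrate the feature-matching principle behind semantic neighbor selection, the sketch below compares two camera views with OpenCV's SIFT descriptors and Lowe's ratio test. E-SURF is the paper's own construction and is not available as a library call, so SIFT, the brute-force matcher, the ratio threshold, and the minimum match count used here are stand-in assumptions for illustration only, not the paper's parameters.

# Minimal sketch of feature-based view comparison for semantic neighbor
# selection. SIFT is used as a stand-in for the paper's E-SURF descriptor;
# the thresholds below are illustrative assumptions.
import cv2

def are_semantic_neighbors(img_a, img_b, ratio=0.75, min_matches=30):
    """Return True if two grayscale views share enough matched local features."""
    detector = cv2.SIFT_create()
    _, desc_a = detector.detectAndCompute(img_a, None)
    _, desc_b = detector.detectAndCompute(img_b, None)
    if desc_a is None or desc_b is None:
        return False

    # Brute-force matching with Lowe's ratio test to discard ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(desc_a, desc_b, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])

    # Two camera nodes observing the same or a similar scene are expected
    # to share many consistent local features.
    return len(good) >= min_matches

# Usage: compare two views captured by nearby camera nodes (file names are placeholders).
if __name__ == "__main__":
    view1 = cv2.imread("node1_view.png", cv2.IMREAD_GRAYSCALE)
    view2 = cv2.imread("node2_view.png", cv2.IMREAD_GRAYSCALE)
    print(are_semantic_neighbors(view1, view2))

In a practical VSN, the descriptors (rather than raw images) would be exchanged between nodes, which is why the compactness of the feature representation directly affects communication overhead.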
