Abstract
Multimedia Sensor Networks (MSNs) have gained increasing attention in recent years from both academic and industrial sectors. Unlike scalar sensor networks, the data collected from MSNs are enriched with multimedia content that can be used to define and detect complex, more application-meaningful events. However, doing so requires several processing tasks to be executed, such as decoding multimedia data, extracting semantic information from it, and integrating multimedia data from several sensors. Combining these tasks into a single generic framework for processing and detecting complex events in MSNs is of great interest. Developing such a framework is challenging, however, due to the heterogeneous infrastructure of MSNs and the diverse types of multimedia data they produce. Moreover, events in MSNs need to be detected in near real time. In this study, we propose an ontology-based framework to support complex event modeling and detection in MSNs. Our framework helps users model MSN infrastructure, complex events, and the data collected from MSNs. It is also able to extract semantic information from multimedia data and to detect and report events in near real time. Our framework is validated by means of prototyping and simulation. The results show that it can detect complex multimedia events in a high-workload scenario with an average detection latency of less than 625 milliseconds.
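To make the processing pipeline in the abstract concrete, the sketch below illustrates the general idea of per-sensor semantic extraction followed by cross-sensor integration into a complex-event check. It is a minimal, hypothetical illustration: all names (SemanticAnnotation, extract_semantics, detect_complex_event, and the corroboration rule) are assumptions for demonstration, not the paper's actual API or detection logic.

```python
# Hypothetical sketch of an MSN complex-event pipeline:
# decode/annotate each sensor's media, then fuse annotations across sensors.
from dataclasses import dataclass
import time


@dataclass
class SemanticAnnotation:
    sensor_id: str    # which multimedia sensor produced the observation
    label: str        # semantic concept extracted from the media stream
    timestamp: float  # observation time in seconds


def extract_semantics(sensor_id: str, frame: bytes) -> SemanticAnnotation:
    """Stand-in for decoding a frame and translating it into a semantic label."""
    label = "person_detected" if frame else "empty_scene"
    return SemanticAnnotation(sensor_id, label, time.time())


def detect_complex_event(annotations: list[SemanticAnnotation],
                         window_s: float = 2.0) -> bool:
    """Toy complex-event rule: the same semantic concept reported by at least
    two distinct sensors within a short time window."""
    if not annotations:
        return False
    latest = annotations[-1].timestamp
    recent = [a for a in annotations if a.timestamp >= latest - window_s]
    sensors = {a.sensor_id for a in recent if a.label == "person_detected"}
    return len(sensors) >= 2


# Usage: observations from two cameras corroborate the same event.
store = [extract_semantics("cam-1", b"\x01"), extract_semantics("cam-2", b"\x01")]
print(detect_complex_event(store))  # True: reported by two distinct sensors
```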