Abstract

Event recognition is one of the popular research areas in smart cities that has attracted great attention from researchers. Since the Internet of Things (IoT) is mainly focused on scalar data events, research is shifting towards the Internet of Multimedia Things (IoMT), which is still in its infancy. Presently, multimedia event-based solutions provide low response time, but they are domain-specific and can handle only familiar classes (bounded vocabulary). However, multiple applications within smart cities may require processing of numerous familiar as well as unseen concepts (unbounded vocabulary) in the form of subscriptions. Deep neural network-based techniques are popular for image recognition, but they require training classifiers for unseen concepts and annotated bounding boxes with images. In this work, we explore the problem of training classifiers for unseen/unknown classes while reducing the response time of multimedia event processing (specifically object detection). We propose two domain-adaptation-based models leveraging Transfer Learning (TL) and Large Scale Detection through Adaptation (LSDA). The preliminary results show that the proposed framework can achieve 0.5 mAP (mean Average Precision) within a response time of 30 min for unseen concepts. We expect to improve this further using a modified LSDA with the fastest classification (MobileNet) and detection (YOLOv3) networks, along with eliminating the requirement of annotated bounding boxes.
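The LSDA idea the abstract leverages can be illustrated with a toy sketch: on classes where both classification and detection weights exist, learn the classification-to-detection transformation, then apply it to classes that have only classification weights, so no bounding-box annotations are needed for them. This is not the authors' implementation; the dimensions, the random weights, and the simple mean-shift adaptation below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_seen, n_unseen, dim = 80, 20, 128

# Classification weights for all classes (assumed pretrained on image labels).
w_cls_seen = rng.normal(size=(n_seen, dim))
w_cls_unseen = rng.normal(size=(n_unseen, dim))

# Detection weights exist only for seen classes (they required box annotations).
# Here we fabricate them as classification weights plus a common shift and noise.
true_shift = 0.1 * rng.normal(size=dim)
w_det_seen = w_cls_seen + true_shift + 0.01 * rng.normal(size=(n_seen, dim))

# LSDA-style adaptation: estimate the mean classification->detection
# transformation from the seen classes...
delta = (w_det_seen - w_cls_seen).mean(axis=0)

# ...and transfer it to unseen classes, yielding detector weights
# without any bounding-box annotations for those classes.
w_det_unseen = w_cls_unseen + delta

print(w_det_unseen.shape)
```

In the real method the transformation is richer than a single mean shift, but the sketch captures why adaptation removes the bounding-box requirement for unseen concepts.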


