Abstract

With the evolution and fusion of wireless sensor network and embedded camera technologies, distributed smart camera networks have emerged as a new class of systems for wide-area surveillance applications. Wireless networks, however, impose a number of constraints on such systems, most notably limited communication bandwidth. Existing approaches to target tracking with a camera network typically rely on target handover mechanisms between cameras, or combine the results of 2D trackers in each camera into a 3D target estimate. Such approaches suffer from scale-selection, target-rotation, and occlusion problems, drawbacks typically associated with 2D tracking. In this paper, we present an approach for tracking multiple targets directly in 3D space using a network of smart cameras. The approach employs multi-view histograms to characterize targets in 3D space, using color and texture as the visual features. The visual features from each camera, along with the target models, are used in a probabilistic tracker to estimate the target state. We introduce four variations of our base tracker that incur different computational and communication costs on each node and achieve different tracking accuracies. We demonstrate the effectiveness of the proposed trackers by comparing their performance to that of a 3D tracker that fuses the results of independent 2D trackers. We also analyze the performance of the base tracker along Quality-of-Service (QoS) and Quality-of-Information (QoI) metrics, and study the QoS vs. QoI trade-offs among the proposed tracker variations. Finally, we demonstrate our tracker in a real-life scenario using a camera network deployed in a building.
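
The pipeline described above (per-camera appearance histograms compared against a target model inside a probabilistic tracker) can be illustrated with a minimal sketch. The code below assumes a particle-filter-style tracker and uses the Bhattacharyya coefficient as the histogram similarity measure; these choices, and helper names such as project_to_view, are illustrative assumptions rather than details taken from the paper.

```python
# Hedged sketch (not the paper's code): score a hypothesized 3D target position
# against color histograms gathered from multiple camera views.
import numpy as np

NUM_BINS = 8  # assumed bins per color channel

def color_histogram(patch):
    """Normalized color histogram of an image patch (H x W x 3, uint8)."""
    hist, _ = np.histogramdd(
        patch.reshape(-1, 3), bins=(NUM_BINS,) * 3, range=((0, 256),) * 3
    )
    hist = hist.ravel()
    return hist / max(hist.sum(), 1e-9)

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return np.sum(np.sqrt(p * q))

def particle_weight(particle_xyz, views, model_hists, sigma=0.1):
    """
    Combine per-camera evidence for one hypothesized 3D position.
    `views` is a list of (frame, project_to_view) pairs, where project_to_view
    maps a 3D point to an image patch around its projection (assumed helper).
    `model_hists` holds the target's reference histogram for each view.
    """
    log_w = 0.0
    for (frame, project_to_view), model in zip(views, model_hists):
        patch = project_to_view(frame, particle_xyz)
        if patch is None:          # target outside this camera's field of view
            continue
        d = 1.0 - bhattacharyya(color_histogram(patch), model)  # distance in [0, 1]
        log_w += -d**2 / (2 * sigma**2)   # Gaussian observation likelihood
    return np.exp(log_w)
```

In this sketch, each camera contributes an independent likelihood term for the same 3D hypothesis, so the fusion happens in world coordinates rather than by matching 2D tracks across views.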

Highlights

  • Smart cameras are evolving along three distinct paths [1]

  • We present an approach for collaborative target tracking in 3D space using a wireless network of smart cameras

  • We model targets in 3D space, circumventing the problems inherent in trackers based on 2D target models

Summary

Introduction

Smart cameras are evolving along three distinct paths [1]. First, single smart cameras focus on integrating sensing with embedded on-camera processing power to perform various vision tasks on-board and deliver abstracted data from the observed scene. Single-camera tracking algorithms are often applied in the image plane. These image-plane (or 2D) trackers often run into problems such as target scale selection, target rotation, occlusion, view dependence, and correspondence across views [2]. These problems, inherent in image-plane trackers, can be circumvented by tracking in 3D space using a network of smart cameras. The challenges for wireless smart camera networks include robust target tracking against scale variation, rotation, and occlusion, especially under the bandwidth constraints imposed by the wireless communication medium. This paper presents an approach for collaborative target tracking in 3D space using a wireless network of smart cameras.
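
The bandwidth argument for on-camera processing can be made concrete with a rough back-of-the-envelope sketch: a node that transmits compact histogram descriptors instead of raw frames sends orders of magnitude less data per frame. The frame size, bin count, and float width below are illustrative assumptions, not figures from the paper.

```python
# Hedged back-of-the-envelope sketch: per-frame payload a smart camera would
# transmit if it sends raw frames versus a compact color-histogram descriptor.
FRAME_W, FRAME_H, CHANNELS = 640, 480, 3          # assumed VGA color frame
NUM_BINS = 8                                      # assumed bins per color channel
BYTES_PER_FLOAT = 4

raw_frame_bytes = FRAME_W * FRAME_H * CHANNELS            # ~921.6 kB per frame
histogram_bytes = (NUM_BINS ** 3) * BYTES_PER_FLOAT       # ~2 kB per descriptor

print(f"raw frame:  {raw_frame_bytes / 1e3:.1f} kB")
print(f"descriptor: {histogram_bytes / 1e3:.1f} kB "
      f"(~{raw_frame_bytes / histogram_bytes:.0f}x smaller)")
```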

Related Work
Feature Selection for Tracking
Target Tracking
Tracking with Camera Networks
Target Representation
Target Model
Target Candidate
Similarity Measure
Probabilistic 3D Tracker
Target State
Similarity Measure and Localization
Estimation of Target Orientation
Tracking Algorithm
Tracker Variations
Tracker T1
Tracker T2
Step 1
Step 2
Tracker T3
Tracker T4
Comparison of All Trackers
Performance Evaluation
Simulated Camera Network
LCR Experiments
FGH Experiments
Conclusions
