Abstract

In this paper, we propose a new video object detection (VoD) method, referred to as temporal feature aggregation and motion-aware VoD (TM-VoD), which produces a joint representation of temporal image sequences and object motion. TM-VoD generates strong spatiotemporal features for VoD by aggregating the temporally redundant information in an image sequence together with the motion context. These features are produced at the feature level in the region proposal stage and at the instance level in the refinement stage. In the region proposal stage, visual features are temporally fused with appropriate weights at the pixel level via a gated attention model. Furthermore, pixel-level motion features are obtained by capturing the changes between adjacent visual feature maps. In the refinement stage, the visual features are aligned and aggregated at the instance level. We propose a novel feature alignment method that uses the initial region proposals as anchors to predict the box coordinates for all video frames. Moreover, instance-level motion features are obtained by applying region of interest (RoI) pooling to the pixel-level motion features and by encoding the sequential changes in the box coordinates. Finally, all these instance-level features are concatenated to produce a joint representation of the objects. Experiments on the ImageNet VID dataset demonstrate that the proposed method significantly outperforms existing VoD methods and achieves performance comparable with that of state-of-the-art methods.
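The paper itself gives the precise formulation of the region proposal stage; as a rough illustration only, the pixel-level gated temporal fusion and the feature-difference motion cues described above might be sketched in PyTorch as follows. The module name, the 1x1-convolution gate, and the use of a plain temporal difference for motion are our assumptions for the sketch, not the authors' exact design.

```python
import torch
import torch.nn as nn


class GatedTemporalFusion(nn.Module):
    """Hypothetical sketch of pixel-level gated attention fusion.

    A scalar gate is predicted per pixel for each frame's feature map,
    normalized across time with a softmax, and used to form a weighted
    sum of the frame features (assumed design, not the paper's exact one).
    """

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 conv predicts one gate value per pixel per frame.
        self.gate = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (T, C, H, W) visual feature maps of T adjacent frames.
        logits = self.gate(feats)               # (T, 1, H, W)
        weights = torch.softmax(logits, dim=0)  # normalize over time
        return (weights * feats).sum(dim=0)     # fused map: (C, H, W)


def pixel_motion_features(feats: torch.Tensor) -> torch.Tensor:
    # Capture changes between adjacent visual feature maps as simple
    # temporal differences; output shape is (T-1, C, H, W).
    return feats[1:] - feats[:-1]


if __name__ == "__main__":
    frames = torch.randn(5, 256, 32, 32)    # 5 frames of backbone features
    fusion = GatedTemporalFusion(256)
    fused = fusion(frames)                  # (256, 32, 32)
    motion = pixel_motion_features(frames)  # (4, 256, 32, 32)
    print(fused.shape, motion.shape)
```

In the refinement stage described above, such pixel-level motion maps would then be RoI-pooled per proposal and concatenated with the aligned visual features and the encoded box-coordinate changes to form the joint instance-level representation.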
