Abstract

The event camera, a new bio-inspired vision sensor with low latency and high temporal resolution, has shown great potential and promising applications in machine vision and artificial intelligence. Corner detection is a key step in object motion estimation and tracking. However, most existing event-based corner detectors, such as G-eHarris and Arc*, produce a large number of redundant or false corners and cannot strike a balance between accuracy and real-time performance, especially in complex, highly textured scenes that demand greater computational cost. To address these issues, we propose an asynchronous corner detection method, a double threshold filter with Sigmoid eHarris (DTFS-eHarris), together with an asynchronous corner tracker. The main contributions are a double threshold filter designed to reduce redundant events and an improved Sigmoid function used to represent the Surface of Active Events (Sigmoid*-SAE). We selected four scenes (shapes, dynamic, poster, and boxes) from the public event camera dataset DAVIS240C to compare our method with the existing state-of-the-art hybrid method; it achieves more than a 10% reduction in false positive rate, a 5% improvement in accuracy, and a 20% improvement in throughput. The evaluations indicate that DTFS-eHarris offers a significant improvement, especially in complex scenes, and is therefore expected to enhance the real-time performance and feature detection accuracy of future robotic applications.
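
The abstract only sketches the pipeline, so the following Python fragment is a minimal illustration of the general idea rather than the authors' implementation: a sigmoid-normalised Surface of Active Events, a double-threshold filter that discards redundant events, and a Harris response computed on the surviving local patches. The window size, sigmoid scale, both threshold values, and the exact form of the filter condition are assumptions made for readability, not the parameters or formulas of DTFS-eHarris.

```python
import numpy as np

# Illustrative sketch only: the abstract does not spell out the DTFS-eHarris
# formulas, so the patch size, sigmoid scale, and both threshold values below
# are assumptions chosen for readability rather than the authors' parameters.

WIDTH, HEIGHT = 240, 180      # DAVIS240C sensor resolution
PATCH = 9                     # local window for the Harris score (assumed)
SIGMOID_SCALE = 50e3          # temporal decay constant in microseconds (assumed)
T_LOW, T_HIGH = 0.1, 0.9      # double-threshold bounds on local activity (assumed)

# Surface of Active Events (SAE): latest event timestamp per pixel.
sae = np.full((HEIGHT, WIDTH), -1e12)


def sigmoid_sae_patch(x, y, t):
    """Map the local SAE patch into (0, 1) with a sigmoid of the event age."""
    r = PATCH // 2
    patch = sae[y - r:y + r + 1, x - r:x + r + 1]
    age = np.clip(t - patch, 0.0, 20.0 * SIGMOID_SCALE)   # avoid exp overflow
    return 1.0 / (1.0 + np.exp(age / SIGMOID_SCALE - 1.0))


def harris_score(surface, k=0.04):
    """Plain Harris corner response computed on the sigmoid-normalised patch."""
    gy, gx = np.gradient(surface)
    a, b, c = (gx * gx).sum(), (gx * gy).sum(), (gy * gy).sum()
    return (a * c - b * b) - k * (a + c) ** 2


def process_event(x, y, t, harris_thresh=0.05):
    """Update the SAE, filter redundant events, and score the survivors."""
    sae[y, x] = t
    r = PATCH // 2
    if not (r <= x < WIDTH - r and r <= y < HEIGHT - r):
        return None                       # skip events too close to the border
    surface = sigmoid_sae_patch(x, y, t)
    # Double-threshold filter (assumed form): discard events whose local
    # activity is either too stale or too saturated to yield a useful corner.
    activity = surface.mean()
    if not (T_LOW < activity < T_HIGH):
        return None
    return (x, y, t) if harris_score(surface) > harris_thresh else None
```

As a usage example, `process_event(120, 90, 1_000_000)` either returns the event as a candidate corner or `None` when the double-threshold filter rejects it or the Harris response falls below the (assumed) threshold.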
