Abstract. Compared with conventional cameras, event cameras represent a notable advance in neuromorphic imaging technology and have attracted considerable attention from researchers owing to their distinct advantages. However, event cameras are susceptible to significant measurement noise, which can degrade the performance of algorithms that rely on event streams for tasks such as perception and navigation. In this study, we introduce a novel method for denoising event streams, aiming to filter out events that do not reflect genuine logarithmic intensity changes in the observed scene. Our approach exploits the asynchronous nature and spatiotemporal properties of events, culminating in a novel Asynchronous Spatio-Temporal Event Denoising neural Network (ASTEDNet). The network operates directly on event streams, avoiding their conversion into denser representations such as image frames and thereby preserving their inherent asynchrony. Drawing on graph encoding and temporal convolutional networks, we incorporate spatiotemporal feature attention mechanisms to capture the temporal and spatial correlations between events, enabling each active event pixel in the original stream to be classified as either a genuine intensity change or noise. Comparative evaluations on multiple datasets against state-of-the-art methods demonstrate the efficacy and robustness of the proposed algorithm in removing noise while retaining meaningful event information within the scene.
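The per-event signal/noise decision described in the abstract can be illustrated with a deliberately simplified, non-learned sketch: counting the spatiotemporal neighbours of each event as a crude proxy for the correlations that the network learns. All function names and thresholds below are hypothetical illustrations and are not part of the paper's actual method.

```python
import numpy as np

def classify_events(events, r_xy=2.0, r_t=0.01, min_support=3):
    """Label each event as signal (1) or noise (0) by counting its
    spatiotemporal neighbours -- a hand-crafted stand-in for the learned
    per-event classification described in the abstract.

    events: (N, 3) array of (x, y, t), with t in seconds.
    r_xy:   spatial neighbourhood radius in pixels (illustrative value).
    r_t:    temporal neighbourhood radius in seconds (illustrative value).
    min_support: minimum neighbour count for an event to count as signal.
    """
    xy = events[:, :2]
    t = events[:, 2]
    labels = np.zeros(len(events), dtype=int)
    for i in range(len(events)):
        near_xy = np.linalg.norm(xy - xy[i], axis=1) <= r_xy
        near_t = np.abs(t - t[i]) <= r_t
        support = np.count_nonzero(near_xy & near_t) - 1  # exclude self
        labels[i] = int(support >= min_support)
    return labels

# A tight cluster of events (correlated in space and time) versus one
# isolated event: the cluster is kept, the outlier is flagged as noise.
evts = np.array([
    [10.0, 10.0, 0.000],
    [10.0, 11.0, 0.001],
    [11.0, 10.0, 0.002],
    [10.0, 10.0, 0.003],
    [11.0, 11.0, 0.004],
    [50.0, 50.0, 0.500],  # isolated -> noise
])
print(classify_events(evts))  # -> [1 1 1 1 1 0]
```

The real network replaces this fixed neighbourhood rule with graph encoding, temporal convolutions, and attention, so that the notion of "support" is learned rather than thresholded.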