In high-speed aerodynamics research, point sensors are ideal for embedding in test models but lack spatial resolution, whereas high-speed cameras offer spatiotemporally resolved measurements but involve significant footprint, cost, and data volume. To address these tradeoffs, this study explores the application of nascent event-based cameras to high-speed testing. Event-based cameras support continuous, data-sparse, kilohertz-equivalent imaging at 720p resolution in form factors as small as 36 mm and as light as 40 g, combining the benefits of point sensors and high-speed cameras. However, these attributes stem from asynchronous pixels that necessitate unique operating and postprocessing approaches. Here, the authors adapted event-based cameras for two- and three-dimensional photogrammetric tracking of aeroelastic structures, demonstrating an event-based workflow and two tracking algorithms (mean-shift filtering and circle fit). Bench-top validations achieved three-dimensional precision of 0.35 mm/s on 20 mm/s motion across a 259 mm field of view, while two-dimensional measurements of an aeroelastic titanium panel in Mach 0.76 transonic flow identified millimeter-scale vibrations at 43.7, 120, and 270 Hz, validated against a laser displacement sensor and a high-speed camera. The transonic test's raw data totaled 145.8 MB on the event-based camera, compared with 88.5 GB on the high-speed camera. These results demonstrate the viability of event-based techniques in high-speed aerodynamic testing while highlighting challenges such as polarity switching and pixel latency.
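As an illustrative sketch only (not the authors' implementation), the circle-fit idea can be applied to an event stream by binning events into short time windows and fitting a circle to each bin's pixel coordinates with an algebraic least-squares (Kåsa) fit. The function and parameter names below (`track_marker`, `window_us`) and the synthetic event data are assumptions for demonstration.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic (Kasa) least-squares circle fit to event coordinates."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs**2 + ys**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

def track_marker(events, window_us=1000):
    """Bin an event stream of rows (t, x, y, polarity) into fixed time windows
    and fit a circle to each bin, yielding one (t, cx, cy) sample per window."""
    events = np.asarray(events, dtype=float)
    centers = []
    for start in np.arange(events[0, 0], events[-1, 0], window_us):
        mask = (events[:, 0] >= start) & (events[:, 0] < start + window_us)
        if mask.sum() < 3:          # a circle needs at least three events
            continue
        cx, cy, _ = fit_circle(events[mask, 1], events[mask, 2])
        centers.append((start, cx, cy))
    return np.array(centers)

# Synthetic usage: events scattered on a circular marker centered at (64, 48) px
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 500)
t = np.sort(rng.uniform(0, 5000, 500))            # timestamps in microseconds
ev = np.column_stack([t, 64 + 5 * np.cos(theta), 48 + 5 * np.sin(theta),
                      rng.integers(0, 2, 500)])
print(track_marker(ev)[:3])                       # recovered (t, cx, cy) per window
```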