Abstract

In previous research, we proposed an augmented reality (AR) and mixed reality (MR) system that visualizes sound field data measured by beamforming. The system rendered sound pressure as a color map on an AR/MR device in real time. Free viewpoint selection, combined with projection of the color map onto the spatial geometry observed by the AR/MR sensor, enabled users to understand the location of a sound source. However, the spatial geometry of the measurement space was used only for display and played no role in the sound source localization itself; as a result, the previous system, which projected a flat color map onto the geometry, could not measure the sound field precisely in space. In this paper, we propose a sound field measurement and AR/MR visualization system that uses point cloud data acquired by a light detection and ranging (LiDAR) sensor. Minimum variance distortionless response (MVDR) beamforming computes the beamforming output directly at the point cloud points: the sound sources are assumed to lie on the point cloud, which allows fast localization of a large number of sources. In addition, the scattered sound pressure information makes it possible to interpret the sound field in various ways. We present basic experiments conducted in an anechoic chamber.
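As a rough illustration of the idea described above, the following sketch evaluates the MVDR output power at each LiDAR point, treating every point as a candidate source location. All function and variable names are hypothetical, and the narrowband free-field steering model is an assumption for illustration; the paper's actual processing chain may differ.

```python
import numpy as np

def mvdr_power_map(snapshots, mic_pos, points, freq, c=343.0, diag_load=1e-3):
    """MVDR output power P(r) = 1 / (a(r)^H R^{-1} a(r)) at each candidate
    point r, where the candidates are assumed to come from a LiDAR point
    cloud (hypothetical sketch, not the authors' implementation).

    snapshots : (M, T) complex narrowband snapshots for M microphones
    mic_pos   : (M, 3) microphone positions [m]
    points    : (N, 3) candidate source positions from the point cloud [m]
    freq      : analysis frequency [Hz]
    """
    M, T = snapshots.shape
    # Sample spatial covariance with diagonal loading for robustness.
    R = snapshots @ snapshots.conj().T / T
    R += diag_load * np.trace(R).real / M * np.eye(M)
    R_inv = np.linalg.inv(R)
    # Free-field steering vectors from mic-to-point propagation delays.
    d = np.linalg.norm(points[:, None, :] - mic_pos[None, :, :], axis=2)  # (N, M)
    a = np.exp(-2j * np.pi * freq * d / c)                                # (N, M)
    # Quadratic form a^H R^{-1} a, vectorized over all N candidate points.
    denom = np.einsum("nm,mk,nk->n", a.conj(), R_inv, a).real
    return 1.0 / denom
```

Because the steering vectors and the quadratic form are vectorized over all candidate points, the power map scales to large point clouds, which is consistent with the fast localization of many sources mentioned above.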
