Advances in brain imaging techniques have significantly expanded the size and complexity of real-time neuroimaging and behavioral data. However, identifying patterns, trends, and synchronies within these datasets poses a substantial computational challenge. Here, we demonstrate an approach that translates time-varying neuroimaging data into unique audiovisualizations: audible representations of dynamic data merged with simplified, color-coded movies of spatial components and behavioral recordings. Multiple variables can be encoded as different musical instruments, letting the observer differentiate and track several dynamic parameters in parallel. This representation enables intuitive exploration of these datasets for behavioral correlates and for spatiotemporal features such as patterns, rhythms, and motifs that can be difficult to detect through conventional data interrogation methods. These audiovisual representations offer a novel way to perceive the organization and patterns of real-time brain activity, and provide an intuitive and compelling method for complex data visualization across a wide range of applications.
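The core mapping the abstract describes, a dynamic variable rendered as an audible pitch trajectory, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, pitch range, and linear value-to-frequency mapping are all assumptions for demonstration.

```python
import numpy as np

def sonify_trace(trace, sr=8000, note_dur=0.25, f_lo=220.0, f_hi=880.0):
    """Map a 1-D activity trace to a sequence of pure tones (illustrative sketch).

    Each sample becomes one short note whose pitch scales linearly between
    f_lo and f_hi with the min-max-normalized signal value. A real pipeline
    would use distinct timbres (instruments) per variable and mix the tracks.
    """
    trace = np.asarray(trace, dtype=float)
    lo, hi = trace.min(), trace.max()
    # Normalize to [0, 1]; constant traces map to the lowest pitch.
    norm = (trace - lo) / (hi - lo) if hi > lo else np.zeros_like(trace)
    t = np.linspace(0.0, note_dur, int(sr * note_dur), endpoint=False)
    notes = [np.sin(2 * np.pi * (f_lo + (f_hi - f_lo) * v) * t) for v in norm]
    return np.concatenate(notes)  # mono waveform sampled at sr Hz

audio = sonify_trace([0.1, 0.5, 0.9, 0.3])
```

Rendering two variables with different waveforms (e.g. adding harmonics to one) and summing the results would approximate the "different instruments" encoding described above, letting a listener follow both streams in parallel.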