Abstract Using sonification in scientific data analysis adds dimensions beyond visualization, potentially increasing researchers’ analytical capabilities and fostering inclusion and accessibility. This research explores the potential of multimodal Integral Field Spectroscopy (IFS) applied to galaxy analysis through the development and evaluation of a tool that complements the visualization of datacubes with sound. The proposed application, ViewCube, provides interactive visualizations and sonifications of spectral information across a two-dimensional field of view, and its architecture is designed to incorporate future sonification approaches. The first sonification implementation described in this article uses a deep learning module to generate binaural unsupervised auditory representations. The work includes a qualitative and quantitative user study based on an online questionnaire, aimed at both specialized and non-specialized participants, focusing on the case study of datacubes of galaxies from the Calar Alto Integral Field Spectroscopy Area (CALIFA) survey. Of the 67 participants who completed the questionnaire, 42 had the opportunity to test the application in person before filling out the online survey. 81 per cent of these 42 participants praised the tool’s interactive responsiveness, 79.1 per cent of the complete sample found the application ‘Useful’, and 58.2 per cent rated its aesthetics as ‘Good’. The quantitative results suggest that all participants were able to retrieve information from the sonifications, indicating that previous experience in the analysis of sound events was more helpful than previous knowledge of the data for the proposed tasks, and highlighting the importance of training and attention to detail for the understanding of complex auditory information.