One of the human brain's remarkable traits is its capacity to dynamically coordinate the activities of multiple brain regions or networks in response to a changing external environment. Studying dynamic functional brain networks (DFNs) and their role in perception, assessment, and action can significantly advance our understanding of how the brain responds to patterns of sensory input. Movies provide a valuable tool for studying DFNs: they offer a naturalistic paradigm that can evoke complex cognitive and emotional experiences through rich, multimodal, and dynamic stimuli. However, most previous research on DFNs has concentrated on the resting-state paradigm, investigating the topological structure of temporally dynamic brain networks constructed from predefined templates; the dynamic spatial configurations of the functional networks elicited by naturalistic stimuli remain underexplored. In this study, we employed an unsupervised dictionary learning and sparse coding method combined with a sliding-window strategy to map and quantify the dynamic spatial patterns of functional brain networks (FBNs) in naturalistic functional magnetic resonance imaging (NfMRI) data, and further evaluated whether the temporal dynamics of distinct FBNs align with the sensory, cognitive, and affective processes involved in the subjective perception of the movie. The results revealed that movie viewing evoked complex FBNs whose spatial patterns varied over time with the movie storyline and correlated with both the movie annotations and subjective ratings of the viewing experience. The reliability of the DFNs was validated by computing the intra-class correlation coefficient (ICC) between two scanning sessions under the same naturalistic paradigm, separated by a three-month interval. Our findings offer novel insight into the dynamic properties of FBNs in response to naturalistic stimuli and could deepen our understanding of the neural mechanisms underlying the brain's dynamic changes during the processing of visual and auditory stimuli.
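To make the pipeline concrete, the following is a minimal sketch, not the authors' implementation, of how a sliding-window strategy can be combined with dictionary learning and sparse coding to extract time-varying spatial network maps from fMRI data. It uses scikit-learn's MiniBatchDictionaryLearning as a stand-in for the unsupervised method described above; the data layout (timepoints × voxels), window length, stride, and number of networks are all illustrative assumptions rather than the study's actual parameters.

```python
# Sketch of sliding-window dictionary learning / sparse coding for
# time-varying spatial functional brain network (FBN) maps.
# Assumptions: fmri is preprocessed and shaped (n_timepoints, n_voxels);
# window length, stride, and component count are illustrative only.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

def sliding_window_fbns(fmri, win_len=40, stride=10, n_networks=20):
    """Return one (n_voxels, n_networks) spatial-map array per window.

    Within each window, voxels are treated as samples and timepoints as
    features, so the learned dictionary atoms are temporal patterns and
    the sparse codes form one spatial coefficient map per atom -- the
    spatial pattern of a functional brain network in that window.
    """
    n_t, _ = fmri.shape
    maps = []
    for start in range(0, n_t - win_len + 1, stride):
        window = fmri[start:start + win_len]       # (win_len, n_voxels)
        dl = MiniBatchDictionaryLearning(
            n_components=n_networks,
            alpha=1.0,                      # sparsity penalty on the codes
            transform_algorithm="lasso_lars",
            random_state=0,
        )
        # Samples = voxels, features = within-window timepoints, so
        # fit_transform yields sparse spatial maps for this window.
        spatial_maps = dl.fit_transform(window.T)  # (n_voxels, n_networks)
        maps.append(spatial_maps)
    return maps

# Toy usage: 200 timepoints, 500 "voxels" of synthetic data.
rng = np.random.default_rng(0)
fmri = rng.standard_normal((200, 500))
window_maps = sliding_window_fbns(fmri)
print(len(window_maps), window_maps[0].shape)      # 17 windows of (500, 20)
```

In practice, the networks learned in different windows would still need to be matched across window positions (for example, by spatial correlation against a reference decomposition) before their temporal dynamics could be related to movie annotations or to session-to-session ICC.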