Abstract

Background and objective

This article presents a multimodal analysis of startle-type responses using a variety of physiological, facial, and speech features. These multimodal components of the startle-type response reflect complex brain–body reactions to a sudden, intense stimulus. Additionally, the proposed multimodal evaluation of reflexive and emotional reactions to startle-eliciting stimuli, and of the underlying neural networks and pathways, could be applied in the diagnostics of different psychiatric and neurological diseases. Different startle-type stimuli can be compared in the strength with which they elicit startle responses, i.e. their potential to activate stress-related neural pathways, underlying biomarkers, and corresponding behavioral reactions.

Methods

An innovative method for measuring startle-type responses using multimodal stimuli and multimodal feature analysis has been introduced. Individuals' multimodal reflexive and emotional expressions during startle-type elicitation were assessed through the corresponding physiological, speech, and facial features in ten female psychology students. Different startle-eliciting stimuli, such as noise and airblast probes, as well as a variety of visual and auditory stimuli of different valence and arousal levels, based on images from the International Affective Picture System (IAPS) and/or sounds from the International Affective Digitized Sounds (IADS) database, were designed and tested. Combined into more complex startle-type stimuli, such composite stimuli can potentiate the evoked response of the underlying neural networks, as well as of the corresponding neurotransmitters and neuromodulators; this is referred to as increased power of response elicitation. The intensity and magnitude of multimodal responses to the selected startle-type stimuli were analyzed using effect sizes and medians of the dominant multimodal features: skin conductance, eye blink, head movement, speech fundamental frequency, and speech energy.
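The effect-size and median summaries of a response feature can be sketched as follows. The feature values below are hypothetical, and Cohen's d is used as one common effect-size measure; the abstract does not state which effect-size statistic the authors computed, so this is an illustrative assumption, not their analysis code.

```python
import numpy as np

def cohens_d(response, baseline):
    """Cohen's d effect size between a response sample and its baseline
    (difference of means divided by the pooled standard deviation)."""
    pooled_var = (np.var(response, ddof=1) + np.var(baseline, ddof=1)) / 2
    return (np.mean(response) - np.mean(baseline)) / np.sqrt(pooled_var)

# Hypothetical skin conductance responses (microsiemens) during stimulation
# versus a pre-stimulus baseline window -- illustrative values only.
response = np.array([0.8, 1.1, 0.9, 1.3, 1.0])
baseline = np.array([0.2, 0.3, 0.25, 0.4, 0.35])

print(cohens_d(response, baseline))   # large positive effect
print(np.median(response))            # median of the dominant feature
```

The same two summaries (effect size and median) would be computed per feature and per paradigm, which is what makes the aggregated "power of response elicitation" comparison in the Results possible.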
The significance of the observed effects and the comparisons between paradigms were evaluated using one-tailed t-tests and ANOVA, respectively. Skin conductance response habituation was analyzed using ANOVA and post hoc multiple comparison tests with the Dunn–Šidák correction.

Results

The results revealed specific physiological, facial, and vocal reflexive and emotional responses to the five selected stimulus paradigms: (1) acoustic startle probes, (2) airblasts, (3) IAPS images, (4) IADS sounds, and (5) image–sound–airblast composite stimuli. Overall, the composite and airblast paradigms produced the largest responses across all analyzed features, followed by the sound and acoustic startle paradigms, while the image paradigm consistently elicited the smallest responses. In this context, the power of response elicitation of the selected stimulus paradigms can be described by the aggregated magnitude of the participants' multimodal responses. A habituation effect was observed only in the skin conductance response to the acoustic startle, airblast, and sound paradigms.

Conclusions

This study developed a system for paradigm design and stimulus generation, as well as real-time multimodal signal processing and feature calculation. Experimental paradigms for monitoring individual responses to stressful startle-type stimuli were designed in order to compare the power of response elicitation across various stimuli. The developed system, the applied paradigms, and the obtained results may be useful in further research evaluating individuals' multimodal responses to a variety of aversive emotional distractors and stressful situations.
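As a side note, the Dunn–Šidák correction used for the post hoc comparisons has a standard closed form for the per-comparison significance level. The sketch below assumes a family-wise α = 0.05 and the ten pairwise comparisons implied by five paradigms; these are illustrative assumptions, not values reported in the abstract.

```python
from math import comb

def sidak_alpha(alpha, m):
    """Per-comparison significance level under the Dunn–Šidák correction:
    alpha_per = 1 - (1 - alpha)^(1/m), which keeps the family-wise error
    rate at alpha across m independent comparisons."""
    return 1 - (1 - alpha) ** (1 / m)

# Five stimulus paradigms give comb(5, 2) = 10 pairwise comparisons.
m = comb(5, 2)
print(sidak_alpha(0.05, m))  # ≈ 0.00512, slightly less strict than Bonferroni's 0.005
```

Each pairwise paradigm comparison would then be tested against this adjusted threshold rather than the nominal 0.05.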
