Effect of visual information on subjective impression for sound field in architectural space
In architectural and urban spaces, we are constantly exposed to multimodal stimuli of visual information and sound fields in everyday life. The purpose of this study is to clarify the relationship between subjective impressions of vision and audition, and to acquire knowledge that contributes to architectural and acoustic design. In this report, the following two experiments were carried out, in which subjects were presented with sound fields rendered by real-time convolution as auditory stimuli and panoramic VR images offering 360-degree interactive interior views as visual stimuli: (1) comparison of subjective responses to single-modal versus multimodal presentations of visual and auditory stimuli from various architectural spaces; (2) comparison of subjective responses to various combinations of multimodal presentations of visual and auditory stimuli from various architectural spaces. Analysis of the results clarifies the influence of visual information on the subjective impression of a sound field and the mutual relationship between the subjective impressions of vision and audition in building interiors. Experiment 1 has already shown that visual information significantly affects the subjective impression of a sound field, and the detailed relationship between elements of visual information and parameters of the sound field will be clarified by experiment 2.
- Conference Article
1
- 10.1121/1.4798982
- Jan 1, 2013
- Research Article
7
- 10.3758/s13428-021-01663-w
- Aug 3, 2021
- Behavior Research Methods
Virtual reality (VR) is a new methodology for behavioral studies. In such studies, the millisecond accuracy and precision of stimulus presentation are critical for data replicability. Recently, Python, which is a widely used programming language for scientific research, has contributed to reliable accuracy and precision in experimental control. However, little is known about whether modern VR environments have millisecond accuracy and precision for stimulus presentation, since most standard methods in laboratory studies are not optimized for VR environments. The purpose of this study was to systematically evaluate the accuracy and precision of visual and auditory stimuli generated in modern VR head-mounted displays (HMDs) from HTC and Oculus using Python 2 and 3. We used the newest Python tools for VR and Black Box Toolkit to measure the actual time lag and jitter. The results showed that there was an 18-ms time lag for visual stimulus in both HMDs. For the auditory stimulus, the time lag varied between 40 and 60 ms, depending on the HMD. The jitters of those time lags were 1 ms for visual stimulus and 4 ms for auditory stimulus, which are sufficiently low for general experiments. These time lags were robustly equal, even when auditory and visual stimuli were presented simultaneously. Interestingly, all results were perfectly consistent in both Python 2 and 3 environments. Thus, the present study will help establish a more reliable stimulus control for psychological and neuroscientific research controlled by Python environments.
- Research Article
- 10.21653/tjpr.971297
- Aug 20, 2022
- Türk Fizyoterapi ve Rehabilitasyon Dergisi
Purpose: The role of visual stimuli as the primary stimulus and the effect of an auditory stimulus before external perturbation on the release of Anticipatory Postural Adjustments (APAs) have been investigated, but which type of stimulus (auditory or visual) presented before perturbation has the greater effect on initial APA release remains to be determined. This study therefore aimed to investigate the contributions of visual and auditory stimuli before external perturbation on APAs and the effect of stimulus presentation on APA release at different time intervals.
Methods: Participants were fourteen physical education students (mean age 22.4 ± 2.14 years) exposed to five trials with a visual stimulus and five trials with an auditory stimulus (80 dB) while standing on a Biodex balance platform. External perturbation was applied 1.4 seconds after stimulus presentation. Electromyographic (EMG) activity of the postural muscles was recorded during all trials. APAs were extracted in the intervals -100 to 50 ms (APA1), 50 to 200 ms (APA2), and 200 to 350 ms (APA3). Mixed ANOVA and repeated-measures analysis of variance with Bonferroni correction were used for data analysis.
Results: Presentation of visual and auditory stimuli increased the APAs of the postural muscles: APA3 was greater than APA2, and APA2 greater than APA1 (P≤0.05). The auditory stimulus increased the APAs of the postural muscles more than the visual stimulus did (P≤0.05).
Conclusion: The researchers concluded that providing an auditory stimulus before perturbation has a greater effect on APAs than a visual stimulus in healthy young women. Therefore, auditory stimuli of appropriate intensity could be used to help prevent imbalance or maintain balance. Subsequent research on this topic could compare APA release under visual and auditory stimuli in men and women, athletes and non-athletes, and healthy individuals versus individuals with mobility impairments.
- Research Article
6
- 10.3389/fnhum.2013.00809
- Nov 25, 2013
- Frontiers in Human Neuroscience
Objectives: Functional magnetic resonance imaging (fMRI) is a reliable and non-invasive method with which to localize language function in pre-surgical planning. In clinical practice, visual stimulus presentation is often difficult or impossible due to the patient's restricted language or attention abilities. Therefore, our aim was to investigate modality-specific differences between visual and auditory stimulus presentation.
Methods: Ten healthy subjects participated in an fMRI study comprising two experiments with visual and auditory stimulus presentation. In both experiments, two language paradigms used in clinical practice (one for language comprehension and one for language production) were investigated. In addition to standard data analysis by means of the general linear model (GLM), independent component analysis (ICA) was performed to obtain more detailed information on language-processing networks.
Results: GLM analysis revealed modality-specific brain activation for both language paradigms for the contrast visual > auditory in the area of the intraparietal sulcus and the hippocampus, two areas related to attention and working memory. Using group ICA, a language network was detected for both paradigms independent of stimulus presentation modality. The investigation of language lateralization revealed no significant variations. Visually presented stimuli further activated an attention-shift network, which could not be identified for the auditorily presented language.
Conclusion: The results of this study indicate that visually presented language stimuli additionally activate an attention-shift network. These findings provide important information for pre-surgical planning to preserve reading abilities after brain surgery, significantly improving surgical outcomes. Our findings suggest that the presentation modality for language paradigms should be adapted to the individual indication.
- Research Article
36
- 10.1186/1471-2202-8-14
- Feb 6, 2007
- BMC Neuroscience
Background: Recent findings of a tight coupling between visual and auditory association cortices during multisensory perception in monkeys and humans raise the question whether consistent paired presentation of simple visual and auditory stimuli prompts conditioned responses in unimodal auditory regions or multimodal association cortex once visual stimuli are presented in isolation in a post-conditioning run. To address this issue, fifteen healthy participants took part in a "silent" sparse temporal event-related fMRI study. In the first (visual control) habituation phase they were presented with briefly flashing red visual stimuli. In the second (auditory control) habituation phase they heard brief telephone ringing. In the third (conditioning) phase we coincidently presented the visual stimulus (CS) paired with the auditory stimulus (UCS). In the fourth phase participants either viewed flashes paired with the auditory stimulus (maintenance, CS-) or viewed the visual stimulus in isolation (extinction, CS+) according to a 5:10 partial reinforcement schedule. The participants had no other task than attending to the stimuli and indicating the end of each trial by pressing a button.
Results: During unpaired visual presentations (preceding and following the paired presentation) we observed significant brain responses beyond primary visual cortex in the bilateral posterior auditory association cortex (planum temporale, planum parietale) and in the right superior temporal sulcus, whereas the primary auditory regions were not involved. By contrast, the activity in auditory core regions was markedly larger when participants were presented with auditory stimuli.
Conclusion: These results demonstrate involvement of multisensory and auditory association areas in perception of unimodal visual stimulation, which may reflect the instantaneous forming of multisensory associations and cannot be attributed to sensation of an auditory event.
More importantly, we are able to show that brain responses in multisensory cortices do not necessarily emerge from associative learning but even occur spontaneously to simple visual stimulation.
- Research Article
200
- 10.1016/j.cub.2011.11.039
- Dec 15, 2011
- Current Biology
When Correlation Implies Causation in Multisensory Integration
- Research Article
25
- 10.1038/srep26188
- May 1, 2016
- Scientific Reports
Detecting and integrating information across the senses is an advantageous mechanism for responding efficiently to the environment. In this study, a simple auditory-visual detection task was employed to test whether pupil dilation, generally associated with successful target detection, could be used as a reliable measure for studying multisensory integration processing in humans. We recorded reaction times and pupil dilation in response to a series of visual and auditory stimuli, which were presented either alone or in combination. The results indicated faster reaction times and larger pupil diameter in response to combined auditory and visual stimuli than to the same stimuli presented in isolation. Moreover, the responses in the multisensory condition exceeded the linear summation of the responses obtained in each unimodal condition. Importantly, faster reaction times corresponded to larger pupil dilation, suggesting that the latter can also serve as a reliable measure of multisensory processes. This study will serve as a foundation for the investigation of auditory-visual integration in populations where simple reaction times cannot be collected, such as developmental and clinical populations.
- Abstract
- 10.1016/j.jns.2019.10.1035
- Oct 1, 2019
- Journal of the Neurological Sciences
New characteristics of neurophysiological changes in neurocognitive disorders in HIV-infected patients in Uzbekistan
- Research Article
36
- 10.1371/journal.pone.0004844
- Mar 16, 2009
- PLoS ONE
Background: It is well known that human beings are able to associate stimuli (novel or not) perceived in their environment. For example, this ability is used by children in reading acquisition, when arbitrary associations between visual and auditory stimuli must be learned. Studies tend to consider it an "implicit" process triggered by the learning of letter/sound correspondences. The study described in this paper examined whether the addition of visuo-haptic exploration would help adults learn more effectively the arbitrary associations between novel visual and auditory stimuli.
Methodology/Principal Findings: Adults were asked to learn 15 new arbitrary associations between visual stimuli and their corresponding sounds using two learning methods, which differed according to the perceptual modalities involved in the exploration of the visual stimuli. Adults used their visual modality in the "classic" learning method and both their visual and haptic modalities in the "multisensory" one. After both learning methods, participants showed a similar above-chance ability to recognize the visual and auditory stimuli and the audio-visual associations. However, the ability to recognize the visual-auditory associations was better after the multisensory method than after the classic one.
Conclusion/Significance: This study revealed that adults learn the arbitrary associations between novel visual and auditory stimuli more efficiently when the visual stimuli are explored with both vision and touch. The results are discussed in terms of how they relate to the functional differences of the manual haptic modality and the hypothesis of a "haptic bond" between visual and auditory stimuli.
- Research Article
- 10.3389/conf.fnins.2010.03.00303
- Jan 1, 2010
- Frontiers in Neuroscience
Is multisensory integration Hebbian? Ventriloquism aftereffect w/o simultaneous audiovisual stimuli
Daniel Pages* and Jennifer M. Groh, Duke University, United States
Visual stimuli affect the perceived location of sounds. It has been assumed that the neural mechanism supporting visual recalibration of perceived sound location involves a simple Hebbian mechanism, where simultaneously presented auditory and visual stimuli excite a common population of neurons and 'wire' the auditory stimulus to a new location. However, an alternative possibility is that visual error after auditory localization could be used to 'update' auditory space via a feedback mechanism. Under this view, what you see after you make an eye movement to a sound would play a critical role in whether and how you adjust your sense of sound location. Previous studies of the effects of vision on sound localization have allowed for both possibilities, because visual and auditory stimuli have generally been presented simultaneously, potentially permitting Hebbian associations to form, and have also been left on long enough for visual feedback to be provided following any orienting movements to the sounds, permitting plasticity to be guided by visual reinforcement. Prism adaptation experiments such as those conducted in barn owls could involve either or both mechanisms. In the present study we sought to distinguish between these possibilities by inducing a ventriloquism aftereffect - a persistent shift in the perceived location of sounds following exposure to spatially mismatched visual and auditory stimuli - using tasks permitting only one of these mechanisms to operate. Specifically, the Hebbian task involved simultaneous but short-duration visual and auditory stimuli. The visual and auditory stimuli were both turned off prior to the completion of a saccadic eye movement to the sound.
In contrast, in the feedback task, the visual and auditory stimuli were never on simultaneously. Rather, the sound played first and a visual stimulus was turned on during the saccade to the sound. We tested the impact of exposure to these two types of mismatched visual-auditory trials on the accuracy of sound localization on interleaved auditory-only trials in monkeys. We found a robust shift in auditory localization in the feedback paradigm and not in the Hebbian paradigm. The average shift in the feedback paradigm was approximately 1.2 degrees, or 20% of the 6-degree separation between the visual and auditory stimuli. Our results indicate that a feedback signal is used for visually guided auditory plasticity in the rhesus macaque, and that coincident stimuli are not necessary. More broadly, our results show that important and behaviorally relevant interactions between sensory modalities do not require the presence of stimuli that are coincident in time.
Conference: Computational and Systems Neuroscience 2010, Salt Lake City, UT, United States, 25 Feb - 2 Mar 2010. Presentation Type: Poster Presentation. Citation: Pages D and Groh JM (2010). Is multisensory integration Hebbian? Ventriloquism aftereffect w/o simultaneous audiovisual stimuli. Front. Neurosci. Conference Abstract: Computational and Systems Neuroscience 2010. doi: 10.3389/conf.fnins.2010.03.00303
Received: 07 Mar 2010; Published Online: 07 Mar 2010.
- Research Article
10
- 10.1016/s0163-1047(80)92315-8
- Mar 1, 1980
- Behavioral and Neural Biology
Auditory and visual stimuli as reinforcers among lovebirds ( Agapornis roseicollis)
- Research Article
74
- 10.1016/0006-8993(86)90546-9
- Mar 1, 1986
- Brain Research
Single unit response of noradrenergic, serotonergic and dopaminergic neurons in freely moving cats to simple sensory stimuli
- Research Article
20
- 10.2527/jas1988.663661x
- Jan 1, 1988
- Journal of Animal Science
Auditory, chemical and visual stimuli were used in a factorial trial in an attempt to stimulate feeding in newly weaned piglets. Ninety-six crossbred piglets weaned at 28 d of age were assigned to groups containing four littermates. Each group was placed in a 1.2-m × 1.2-m pen in an isolated room for 48 h. Pens were equipped with nipple waterers and trough-type feeders. The auditory stimulus was piglet and sow nursing vocalizations. A visual stimulus was provided by a lamp that illuminated the feeding area. Auditory and visual stimuli were presented for 5 min once an hour for 48 h. The chemical stimulus consisted of 60 ml of evaporated milk sprayed over the surface of the feed once every 12 h. Water was used in place of milk in control treatments. Piglets were videotaped for 48 h. Frequency and duration of feeding, drinking and lying were recorded for two piglets out of each pen. Auditory stimuli increased (P < .05) the number of drinking bouts per day from 16.3 to 19.2 and the number of drinking bouts associated with stimulus presentation from 6.2 to 8.0. There also was an auditory × day interaction effect on total time spent feeding. On d 2 postweaning, piglets in the auditory treatment group spent more (P < .05) time feeding than did those without auditory stimulation (127.1 vs 104.2 min, respectively). The relatively simple visual and chemical stimuli tested had no significant effects on ingestive behavior. (ABSTRACT TRUNCATED AT 250 WORDS)
- Research Article
- 10.55782/ane-1994-1010
- Jun 30, 1994
- Acta Neurobiologiae Experimentalis
Habituation of the effects elicited by presentation of novel auditory (wide-band noise) and visual (darkness) stimuli on ongoing bar pressing for food was studied in 48 male hooded rats. Novel stimuli elicited a decrease in the bar-press rate. This attenuating effect was strongest at the first onset of a stimulus of a given modality and then slowly decayed during the stimulus action. The effect of the noise stimulus habituated more rapidly than that elicited by darkness. Subsequently, noise onset enhanced bar pressing, and termination of the noise decreased the response rate. In contrast, termination of the darkness increased the response rate. The difference between auditory and visual stimuli in the rapidity of the change from attenuating to facilitating effects was more evident for shorter than for longer stimulus durations. Summation of data from repetitive presentations revealed an overall attenuating effect of the visual stimulus and a facilitating effect of the auditory stimulus on bar-press rate.
- Research Article
292
- 10.1016/s0006-8993(97)00265-5
- Jun 1, 1997
- Brain Research
Burst activity of ventral tegmental dopamine neurons is elicited by sensory stimuli in the awake cat