Abstract

Mismatch negativity (MMN) is generated by sounds that violate a regular sequence of events, including simple regularities based on the repetition of acoustic features (e.g., tone frequency) as well as abstract regularities hidden in the relationships between stimuli (e.g., a pattern of frequency change). The aim of the present study was to investigate whether the time interval between sound events, being either shorter or longer than the temporal window of integration (TWI), affects the dissociated processing of simple versus pattern regularities along the auditory deviance detection system. MMN was recorded in healthy young adults during simple and pattern frequency oddball paradigms using two stimulus onset asynchronies (SOAs) of 180 and 270 ms, and the MMN topographic distributions were compared using the global dissimilarity index (DISS). Simple and pattern MMN topographies differed significantly at the 270 ms SOA (DISS = 0.8349, p < 0.05), but no significant difference was found at the 180 ms SOA (DISS = 0.2516, p = 0.84). These results indicate that timing can modulate the dissociation between the cortical areas involved in encoding simple and pattern regularities, such that at SOAs shorter than the TWI the processing of these regularities shares more similar neural circuits than at longer SOAs.
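For readers unfamiliar with the metric, DISS is conventionally computed as the root-mean-square difference between two average-referenced scalp maps after each has been scaled by its global field power (GFP); it ranges from 0 (identical topographies) to 2 (polarity-inverted topographies). The sketch below illustrates this conventional definition only; the channel count and variable names are illustrative and are not taken from the study.

```python
import numpy as np

def gfp(topography):
    """Global field power: spatial standard deviation of an average-referenced map."""
    v = topography - topography.mean()
    return np.sqrt(np.mean(v ** 2))

def diss(map_a, map_b):
    """Global dissimilarity (DISS) between two scalp maps.

    Each map is average-referenced and normalized by its GFP before the
    root-mean-square difference is taken, so DISS reflects differences in
    topographic shape independent of overall amplitude.
    """
    a = map_a - map_a.mean()
    b = map_b - map_b.mean()
    return np.sqrt(np.mean((a / gfp(a) - b / gfp(b)) ** 2))

# Hypothetical usage with simulated 64-channel MMN maps (placeholders only)
rng = np.random.default_rng(0)
simple_mmn = rng.normal(size=64)   # stand-in for a simple-MMN topography
pattern_mmn = rng.normal(size=64)  # stand-in for a pattern-MMN topography
print(diss(simple_mmn, pattern_mmn))
```

In practice, the statistical significance of an observed DISS value is typically assessed with a permutation (randomization) test across subjects rather than a parametric test.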
