Abstract

Diverse cortical structures are known to coordinate activity as a network when relaying and processing visual information to discriminate visual objects. However, how this discrimination is achieved remains largely unknown. To help answer this question, we used face-house categorization tasks with three levels of noise in the face and house images in functional magnetic resonance imaging (fMRI) experiments involving thirty-three participants. Behavioral performance error and response time (RT) correlated with the noise level in the face-house images. We then built dynamic causal models (DCM) of fMRI blood-oxygenation-level-dependent (BOLD) signals from the face and house category-specific regions in ventral temporal (VT) cortex, the fusiform face area (FFA) and parahippocampal place area (PPA), and from the dorsolateral prefrontal cortex (dlPFC). We found a strong feed-forward intrinsic connectivity pattern from FFA and PPA to dlPFC. Importantly, the feed-forward connectivity to dlPFC was significantly modulated by the perception of both faces and houses. The dlPFC BOLD activity, as well as the connectivity from FFA and PPA to dlPFC, increased with noise level. These results suggest that the FFA-PPA-dlPFC network plays an important role in relaying and integrating competing sensory information to arrive at perceptual decisions.
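As background on the modeling formalism: the intrinsic and modulatory connectivity parameters referred to above correspond to terms of the standard bilinear DCM state equation for hidden neuronal states. The form below is the generic equation of the framework, not the specific model fitted in this study; here x would collect the states of FFA, PPA, and dlPFC, and u the experimental inputs (face/house stimuli at each noise level).

```latex
\frac{dx}{dt} = \Bigl(A + \sum_{j} u_j\, B^{(j)}\Bigr)\, x \; + \; C\, u
```

A encodes the intrinsic (fixed) effective connectivity among regions, each B^{(j)} encodes how experimental input u_j modulates specific connections (e.g., the reported modulation of FFA-to-dlPFC and PPA-to-dlPFC coupling by face and house perception), and C encodes the direct driving influence of inputs on regional activity. The hidden states x are then passed through a hemodynamic forward model to predict the observed BOLD signals.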

Highlights

  • Humans are efficient in perceiving and discriminating visual objects

  • Perception of faces elicits stronger responses in the fusiform face area (FFA; Kanwisher et al., 1997) and perception of houses in the parahippocampal place area (PPA; Aguirre et al., 1998; Epstein and Kanwisher, 1998; Haxby et al., 2001; Vuilleumier et al., 2001), and interaction between these regions is important in the perception of faces and houses (Stephan et al., 2008)

  • Relatively recent studies in the field have shown that the representation of visual information in these areas, called the core system, is not sufficient (Marotta, 2001; Avidan et al., 2005; Schiltz et al., 2006; Avidan and Behrmann, 2009), and that further processing of visual information in higher-order cortical areas, called the extended system, is crucial for discriminating visual objects (Fairhall and Ishai, 2007; Heekeren et al., 2008; Avidan and Behrmann, 2009)

Introduction

Humans are efficient in perceiving and discriminating visual objects. How does the brain receive, relay, and integrate the relevant sensory information to make such perceptual decisions? What are the brain regions involved, and how do these regions coordinate their activity during perceptual decision-making? Previous studies showed that brain areas along the ventral visual pathway process object category-specific visual information (Kanwisher et al., 1997; Haxby et al., 2000, 2001, 2002; Engell and McCarthy, 2010). The encoding of relevant sensory information is one of the main steps in the cognitive chain leading to perceptual decisions. Experiments on both humans and non-human primates have demonstrated that the first stage of perceptual decision-making involves lower-order regions receiving and representing sensory information (Newsome and Paré, 1988; Britten et al., 1992; Salzman et al., 1992; Romo et al., 1998; Hernández et al., 2000; Binder et al., 2004). Relatively recent studies in the field have shown that the representation of visual information in these areas, called the core system, is not sufficient (Marotta, 2001; Avidan et al., 2005; Schiltz et al., 2006; Avidan and Behrmann, 2009), and that further processing of visual information in higher-order cortical areas, called the extended system, is crucial for discriminating visual objects (Fairhall and Ishai, 2007; Heekeren et al., 2008; Avidan and Behrmann, 2009)
