To encode the allocentric spatial information of a viewed object, perceptual information obtained from the first-person perspective must be related to a representation of the entire scene constructed beforehand. A substantial number of studies have investigated such constructed scene information (e.g., cognitive maps), but only a few have examined its influence on perceptual processing. We therefore designed a visually guided saccade task requiring monkeys to gaze at objects at different locations on different backgrounds clipped from large, self-designed mosaic pictures (parental pictures). In each trial, a moving background was presented before the object, indicating the frame position of the background image within its parental picture. We recorded single-unit activity from 377 neurons in the posterior inferotemporal (PIT) cortex of two macaques. Comparable numbers of neurons conveyed space-related (119 of 377) and object-related (125 of 377) information. The space-related neurons coded gaze locations and background images jointly rather than separately, suggesting that PIT neurons represent a particular location within a particular background image. Interestingly, the frame positions of background images within their parental pictures modulated the space-related responses in a manner that depended on the parental picture. Because the frame positions could be acquired only through preceding visual experience, these results may provide neuronal evidence of a mnemonic effect on current perception, one that might represent allocentric object location in a scene beyond the current view.