Abstract

The visual system uses two complementary strategies to process multiple objects simultaneously within a scene and update their spatial positions in real time. It either uses selective attention to individuate a complex, dynamic scene into a few focal objects (i.e., object individuation), or it represents multiple objects as an ensemble by distributing attention more globally across the scene (i.e., ensemble grouping). Neural oscillations may be a key signature for focal object individuation versus distributed ensemble grouping, because they are thought to regulate neural excitability over visual areas through inhibitory control mechanisms. We recorded whole-head MEG data during a multiple-object tracking paradigm, in which human participants (13 female, 11 male) switched between different instructions for object individuation and ensemble grouping on different trials. The stimuli, responses, and the demand to keep track of multiple spatial locations over time were held constant between the two conditions. We observed increased α-band power (9-13 Hz) packed into oscillatory bursts in bilateral inferior parietal cortex during multiple-object processing. Single-trial analysis revealed greater burst occurrences on object individuation versus ensemble grouping trials. By contrast, we found no differences using standard analyses on across-trials averaged α-band power. Moreover, the bursting effects occurred only below/at, but not above, the typical capacity limits for multiple-object processing (at ∼4 objects). Our findings reveal the real-time neural correlates underlying the dynamic processing of multiple-object scenarios, which are modulated by grouping strategies and capacity. They support a rhythmic, α-pulsed organization of dynamic attention to multiple objects and ensembles.

Significance Statement

Dynamic multiple-object scenarios are an important problem in real-world and computer vision.
They require keeping track of multiple objects as they move through space and time. Such problems can be solved in two ways: One can individuate a scene object by object, or alternatively group objects into ensembles. We observed greater occurrences of α-oscillatory burst events in parietal cortex for processing objects versus ensembles and below/at versus above processing capacity. These results demonstrate a unique top-down mechanism by which the brain dynamically adjusts its computational level between objects and ensembles. They help to explain how the brain copes with its capacity limitations in real-time environments and may lead the way to technological innovations for time-critical video analysis in computer vision.

Highlights

  • Dynamic perceptual experiences require the visual system to process multiple objects simultaneously within a scene and update them from moment to moment

  • An ideal task to investigate this ability in experimental settings is multiple-object tracking (MOT) (Pylyshyn and Storm, 1988)

  • We focused on single-trial activity changes in neural oscillations (Lundqvist et al., 2016; for review, see van Ede et al., 2018), because they account for relevant trial-by-trial variability in neural timing expected in time-critical tasks, such as MOT
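The single-trial burst approach mentioned in the highlights can be sketched as threshold-based burst detection on a band-limited power envelope (in the spirit of Lundqvist et al., 2016). This is a minimal illustration only: the filter order, the threshold criterion (here, median + 2 SD of the single-trial power envelope), and the minimum-duration criterion (here, 3 cycles) are assumptions for demonstration, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_alpha_bursts(signal, fs, band=(9.0, 13.0),
                        thresh_sd=2.0, min_cycles=3):
    """Return (start, end) sample indices of epochs where alpha-band
    power exceeds a trial-specific threshold for a minimum duration.
    Threshold and duration criteria are illustrative assumptions."""
    # Band-pass filter in the alpha range (zero-phase)
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    # Instantaneous power from the analytic (Hilbert) envelope
    power = np.abs(hilbert(filtered)) ** 2
    threshold = np.median(power) + thresh_sd * np.std(power)
    above = power > threshold
    # Require at least min_cycles at the band's center frequency
    min_len = int(min_cycles * fs / np.mean(band))
    bursts, start = [], None
    for i, flag in enumerate(np.append(above, False)):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                bursts.append((start, i))
            start = None
    return bursts

# Synthetic single trial: broadband noise with one embedded 11 Hz burst
fs = 500
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(0)
trial = 0.3 * rng.standard_normal(t.size)
burst_mask = (t >= 1.5) & (t < 2.0)
trial[burst_mask] += 2.0 * np.sin(2 * np.pi * 11 * t[burst_mask])
print(detect_alpha_bursts(trial, fs))
```

Counting such detected events per trial, rather than averaging power across trials, is what makes burst-rate differences visible even when the trial-averaged power spectrum shows no condition effect.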



Introduction

Dynamic perceptual experiences require the visual system to process multiple objects simultaneously within a scene and update them from moment to moment.

