The complexity of natural environments requires highly flexible mechanisms for adaptive processing of single and multiple stimuli. Neuronal oscillations could be an ideal candidate for implementing such flexibility in neural systems. Here, we present a framework for structuring attention-guided processing of complex visual scenes in humans, based on multiplexing and phase coding schemes. Importantly, we suggest that the dynamic fluctuations of excitability vary rapidly in magnitude, frequency and waveform over time, i.e., they are not necessarily sinusoidal or sustained oscillations. Different elements of a single object would be processed within a single cycle (burst) of alpha activity (7-14 Hz), allowing for the formation of coherent object representations while separating multiple objects across multiple cycles. Each element of an object would be processed separately in time, expressed as distinct gamma band bursts (>30 Hz) along the alpha phase. Since the processing capacity per alpha cycle is limited, an inverse relationship between object resolution and size of the attentional spotlight ensures independence of the proposed mechanism from absolute object complexity. The frequency and waveform of these fluctuations would depend on the nature of the object being processed and on cognitive demands. Multiple objects would further be organized along the phase of slower fluctuations (e.g., theta), potentially driven by saccades. Complex scene processing, involving covert attention and eye movements, would therefore be associated with multiple frequency changes in the alpha and lower frequency range. This framework embraces the idea of a hierarchical organization of visual processing, independent of environmental temporal dynamics.
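As a rough illustration of the proposed multiplexing scheme (not part of the original abstract), the sketch below builds a toy signal in which gamma bursts representing object elements are placed at successive phases of an alpha cycle, and successive alpha cycles (one per object) are nested within a slower theta cycle. All frequencies, burst counts, durations and the Gaussian burst envelope are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Illustrative parameters (assumed values, not taken from the paper)
fs = 1000.0          # sampling rate (Hz)
theta_f = 5.0        # slow rhythm organizing objects (Hz)
alpha_f = 10.0       # one alpha cycle per object (Hz)
gamma_f = 60.0       # gamma bursts coding object elements (Hz)

n_objects = 2        # objects per theta cycle
n_elements = 3       # elements per object (gamma bursts per alpha cycle)

t = np.arange(0, 1.0 / theta_f, 1.0 / fs)   # time axis covering one theta cycle
signal = np.sin(2 * np.pi * theta_f * t)    # theta carrier

alpha_period = 1.0 / alpha_f
for obj in range(n_objects):
    alpha_start = obj * alpha_period        # each object occupies one alpha cycle
    for elem in range(n_elements):
        # place each element's gamma burst at a distinct alpha phase
        burst_center = alpha_start + (elem + 1) * alpha_period / (n_elements + 1)
        envelope = np.exp(-((t - burst_center) ** 2) / (2 * 0.01 ** 2))
        signal += 0.5 * envelope * np.sin(2 * np.pi * gamma_f * t)

# 'signal' now contains gamma bursts phase-locked to positions within
# alpha cycles, themselves nested in a theta cycle, mimicking the
# nested multiplexing idea described in the abstract.
print(signal.shape)
```

In this toy construction, reading out which alpha phase a gamma burst occupies identifies the element, and which alpha cycle it falls in identifies the object, which is the essence of the phase coding scheme the abstract describes.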