Human attention-guided visual perception is governed by rhythmic oscillations and aperiodic timescales

Abstract

Attention samples visual space sequentially to enhance behaviorally relevant sensory representations. While traditionally conceptualized as a static continuous spotlight, contemporary models of attention highlight its discrete nature. But which neural mechanisms govern the temporally precise allocation of attention? Periodic brain activity as exemplified by neuronal oscillations, as well as aperiodic temporal structure in the form of intrinsic neural timescales, have been proposed to orchestrate the attentional sampling process in space and time. However, both mechanisms have been largely studied in isolation. To date, it remains unclear whether periodic and aperiodic temporal structure reflect distinct neural mechanisms. Here, we combined computational simulations with a multimodal approach encompassing five experiments, and three different variants of classic spatial attention paradigms, to differentiate aperiodic from oscillation-based sampling. Converging evidence across behavior as well as scalp and intracranial electroencephalography (EEG) revealed that periodic and aperiodic temporal regularities can theoretically and experimentally be distinguished. Our results extend the rhythmic sampling framework of attention by demonstrating that aperiodic neural timescales predict behavior in a spatially-, context-, and demand-dependent manner. Aperiodic timescales increased from sensory to association cortex, decreased during sensory processing or action execution, and were prolonged with increasing behavioral demands. These results reveal that multiple, concurrent temporal regularities govern attentional sampling.
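The core distinction the abstract draws, periodic oscillations versus aperiodic intrinsic timescales, can be illustrated with a small simulation. The sketch below is a hypothetical toy example (not the authors' analysis pipeline): an aperiodic AR(1) process has an autocorrelation function that decays exponentially, so its intrinsic timescale can be read off as the lag where the autocorrelation falls below 1/e, whereas an oscillatory signal produces an autocorrelation that itself oscillates. All parameter choices (AR coefficient, 10 Hz frequency, 1000 Hz sampling rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Aperiodic signal: AR(1) process with coefficient phi.
# Its theoretical intrinsic timescale is tau = -1 / ln(phi) samples.
phi = 0.9
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# Periodic signal: 10 Hz oscillation sampled at 1000 Hz plus noise.
fs = 1000
t_ax = np.arange(n) / fs
y = np.sin(2 * np.pi * 10 * t_ax) + 0.5 * rng.standard_normal(n)

def acf(sig, max_lag):
    """Normalized autocorrelation at lags 0..max_lag-1."""
    sig = sig - sig.mean()
    m = len(sig)
    var = np.dot(sig, sig)
    return np.array([np.dot(sig[:m - k], sig[k:]) / var
                     for k in range(max_lag)])

lags = 100
acf_x, acf_y = acf(x, lags), acf(y, lags)

# Aperiodic: estimate the timescale as the first lag below 1/e.
tau_hat = int(np.argmax(acf_x < 1 / np.e))
tau_true = -1 / np.log(phi)  # ~9.5 samples

# Periodic: the ACF swings negative at half the oscillation period
# (lag 50 for 10 Hz at 1000 Hz), so no single decay timescale exists.
print(tau_hat, round(tau_true, 1), round(acf_y.min(), 2))
```

This kind of contrast is why the two temporal regularities are distinguishable in principle: an exponential autocorrelation decay yields one summary number (the timescale), while a rhythmic process yields recurring positive and negative autocorrelation lobes.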

Publication
PLoS Biology
Randolph Helfrich
Principal Investigator