Subtle eye movement metrics reveal task-relevant representations prior to visual search

Anouk M van Loon, Katya Olmos-Solis, Christian NL Olivers

    Research output: Contribution to journal › Article › Academic › peer-review

    30 Citations (Scopus)

    Abstract

    Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is activated prior to the search, and distinguish it from future-relevant or no-longer-relevant VWM content. Participants memorized a target color for a subsequent search task while being instructed to keep central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right of fixation, one of which could match the current target template. In all three experiments, participants made both more and larger eye movements towards the probe matching the target color. The bias was predominantly expressed in microsaccades, 100-250 ms after probe onset. Experiment 2 used a retro-cue technique to show that these metrics distinguish between relevant and dropped representations. Finally, Experiment 3 used a sequential task paradigm and showed that the same metrics also distinguish between current and prospective search templates. Taken together, we show how subtle eye movements track task-relevant representations for selective attention prior to visual search.
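    The abstract describes classifying small eye movements in the 100-250 ms window after probe onset by whether they point toward the probe matching the memorized target color. As a minimal illustration only, the sketch below detects (micro)saccades with a median-based velocity threshold in the spirit of Engbert & Kliegl (2003) and computes the fraction directed toward the matching probe; the function names, parameters (sampling rate, threshold multiplier), and the detection method itself are assumptions for illustration, not the authors' actual analysis pipeline.

    ```python
    import numpy as np

    def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_dur=3):
        """Detect (micro)saccades from gaze traces x, y (deg) sampled at fs Hz,
        using a median-based velocity threshold (Engbert & Kliegl style).
        Returns a list of (onset_idx, offset_idx, dx, dy) tuples.
        Illustrative sketch, not the paper's actual pipeline."""
        # Smoothed velocity: v[i] = (x[i+2] + x[i+1] - x[i-1] - x[i-2]) * fs / 6
        vx = np.convolve(x, [1, 1, 0, -1, -1], mode="same") * (fs / 6.0)
        vy = np.convolve(y, [1, 1, 0, -1, -1], mode="same") * (fs / 6.0)
        # Robust (median-based) velocity SD defines an elliptical threshold
        sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
        sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
        above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
        # Group supra-threshold samples into events of at least min_dur samples
        events, start = [], None
        for i, a in enumerate(above):
            if a and start is None:
                start = i
            elif not a and start is not None:
                if i - start >= min_dur:
                    events.append((start, i, x[i] - x[start], y[i] - y[start]))
                start = None
        return events

    def probe_bias(events, t, probe_onset, match_side, t_min=0.100, t_max=0.250):
        """Fraction of detected saccades in the post-probe window whose horizontal
        component points toward the target-matching probe (match_side: -1 left, +1 right)."""
        window = [(dx, dy) for (s, e, dx, dy) in events
                  if t_min <= t[s] - probe_onset <= t_max]
        if not window:
            return np.nan
        toward = sum(1 for dx, _ in window if np.sign(dx) == match_side)
        return toward / len(window)
    ```

    A bias value above 0.5 on trials where one probe matches the current target color would correspond to the attraction effect reported in the abstract; applying the same measure to dropped (Experiment 2) or prospective (Experiment 3) templates would, per the abstract, yield weaker or absent biases.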

    Original language: English
    Article number: 13
    Pages (from-to): 13-13
    Number of pages: 1
    Journal: Journal of Vision
    Volume: 17
    Issue number: 6
    DOIs
    Publication status: Published - 2017

    Keywords

    • Attentional capture
    • Microsaccades
    • Saccades
    • Visual search
    • Visual working memory