Subtle eye movement metrics reveal task-relevant representations prior to visual search

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is activated prior to the search, and distinguish it from future-relevant or no-longer-relevant VWM content. Participants memorized a target color for a subsequent search task, while being instructed to keep central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right of fixation, one of which could match the current target template. In all three experiments, participants made both more and larger eye movements towards the probe matching the target color. The bias was predominantly expressed in microsaccades, 100-250 ms after probe onset. Experiment 2 used a retro-cue technique to show that these metrics distinguish between relevant and dropped representations. Finally, Experiment 3 used a sequential task paradigm, and showed that the same metrics also distinguish between current and prospective search templates. Taken together, we show how subtle eye movements track task-relevant representations for selective attention prior to visual search.

Original language: English
Article number: 13
Pages (from-to): 1-15
Number of pages: 15
Journal: Journal of Vision
Volume: 17
Issue number: 6
DOI: 10.1167/17.6.13
Publication status: Published - 2017

Keywords

  • Attentional capture
  • Microsaccades
  • Saccades
  • Visual search
  • Visual working memory

Cite this

@article{fb4d59edd9d94775841c7fabbcee8579,
title = "Subtle eye movement metrics reveal task-relevant representations prior to visual search",
abstract = "Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is activated prior to the search, and distinguish it from future-relevant or no-longer-relevant VWM content. Participants memorized a target color for a subsequent search task, while being instructed to keep central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right of fixation, one of which could match the current target template. In all three experiments, participants made both more and larger eye movements towards the probe matching the target color. The bias was predominantly expressed in microsaccades, 100-250 ms after probe onset. Experiment 2 used a retro-cue technique to show that these metrics distinguish between relevant and dropped representations. Finally, Experiment 3 used a sequential task paradigm, and showed that the same metrics also distinguish between current and prospective search templates. Taken together, we show how subtle eye movements track task-relevant representations for selective attention prior to visual search.",
keywords = "Attentional capture, Microsaccades, Saccades, Visual search, Visual working memory",
author = "{van Loon}, {Anouk M.} and Katya Olmos-Solis and Olivers, {Christian N.L.}",
year = "2017",
doi = "10.1167/17.6.13",
language = "English",
volume = "17",
pages = "1--15",
journal = "Journal of Vision",
issn = "1534-7362",
publisher = "Association for Research in Vision and Ophthalmology Inc.",
number = "6",
}
