TY - JOUR
T1 - Subtle eye movement metrics reveal task-relevant representations prior to visual search
AU - van Loon, Anouk M.
AU - Olmos-Solis, Katya
AU - Olivers, Christian N.L.
PY - 2017
Y1 - 2017
N2 - Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is activated prior to search, and distinguish it from future-relevant or no-longer-relevant VWM content. Participants memorized a target color for a subsequent search task while being instructed to maintain central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right of fixation, one of which could match the current target template. In all three experiments, participants made both more and larger eye movements towards the probe matching the target color. The bias was predominantly expressed in microsaccades, 100-250 ms after probe onset. Experiment 2 used a retro-cue technique to show that these metrics distinguish between relevant and dropped representations. Finally, Experiment 3 used a sequential task paradigm and showed that the same metrics also distinguish between current and prospective search templates. Taken together, we show how subtle eye movements track task-relevant representations for selective attention prior to visual search.
KW - Attentional capture
KW - Microsaccades
KW - Saccades
KW - Visual search
KW - Visual working memory
UR - http://www.scopus.com/inward/record.url?scp=85021154706&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85021154706&partnerID=8YFLogxK
U2 - 10.1167/17.6.13
DO - 10.1167/17.6.13
M3 - Article
C2 - 28637052
AN - SCOPUS:85021154706
SN - 1534-7362
VL - 17
SP - 1
EP - 15
JO - Journal of Vision
JF - Journal of Vision
IS - 6
M1 - 13
ER -