
Chen, Siyi; Geyer, Thomas; Zinchenko, Artyom; Müller, Hermann J. and Shi, Zhuanghua (2022): Multisensory Rather than Unisensory Representations Contribute to Statistical Context Learning in Tactile Search. In: Journal of Cognitive Neuroscience, Vol. 34, No. 9: pp. 1702-1717

Full text not available on 'Open Access LMU'.

Abstract

Using a combination of behavioral and EEG measures in a tactile odd-one-out search task with collocated visual items, we investigated the mechanisms underlying facilitation of search by repeated (vs. nonrepeated) spatial distractor-target configurations (contextual cueing) when either the tactile (same-modality) or the visual array (different-modality) context was predictive of the location of the tactile singleton target. Importantly, in both conditions, the stimulation was multisensory, consisting of tactile plus visual items, although the target was singled out in the tactile modality and so the visual items were task-irrelevant. We found that when the predictive context was tactile, facilitation of search RTs by repeated configurations was accompanied by, and correlated with, enhanced lateralized ERP markers of pre-attentive (N1, N2) and, respectively, focal-attentional processing (contralateral delay activity), not only over central (somatosensory) but also posterior (visual) electrode sites, although the ERP effects were less marked over visual cortex. A similar pattern of facilitated RTs and enhanced lateralized (N2 and contralateral delay activity) ERP components was found when the predictive context was visual, although the ERP effects were less marked over somatosensory cortex. These findings indicate that both somatosensory and visual cortical regions contribute to the more efficient processing of the tactile target in repeated stimulus arrays, although their involvement is differentially weighted depending on the sensory modality that contains the predictive information.
