
Past research shows that crossmodal semantic congruence plays a role in the orienting of spatial attention and in visual search. However, the extent to which crossmodal semantic relationships summon attention automatically, or instead require some form of top-down modulation, is still not entirely clear. To date, researchers have used varied methodologies and their outcomes have been inconsistent. Variations in the task relevance of the crossmodal stimulus (ranging from explicitly required to entirely task-irrelevant), the degree of perceptual complexity, and the response modality may all help account for the mixed results of previous experiments. In the present study, we address the effects of audiovisual semantic congruence on spatial attention across variations in task relevance and perceptual load. Participants were presented with an array of visual images of common objects, paired with sounds that could correspond to one of those objects (e.g., a guitar image and a chord sound). Under conditions of relatively low perceptual load, crossmodal semantic congruence was found to speed visual search times regardless of the task relevance of the multisensory stimulus (Experiments 1 and 2). However, under higher perceptual load, task-irrelevant audiovisual semantic congruence failed to attract spatial attention. These results support the conclusion that crossmodal semantic congruence does not attract attention fully automatically but instead draws on top-down processes.

Original publication

DOI

10.20350/digitalCSIC/15662

Type

Journal article

Publication Date

2023

Volume

44