
The sounds elicited when we touch or use everyday objects typically convey potentially useful information about the nature of the stimuli with which we are interacting. Here we review the rapidly growing literature demonstrating the influence of auditory cues (such as overall sound level and the spectral distribution of the sounds) on multisensory product perception. The results of a number of studies now show that modulating the auditory cues elicited by our contact or interaction with different surfaces (such as abrasive sandpapers or even our own skin) and products (including electric toothbrushes, aerosol sprays, food mixers, and cars) can dramatically change the way in which they are perceived, despite the fact that we are often unaware of the influence of such auditory cues on our perception. The auditory cues generated by products can also be modified in order to change people's perception of the quality/efficiency of those products. The principles of sound design have also been used recently to alter people's perception of a variety of foodstuffs. Findings such as these demonstrate the automatic and obligatory nature of multisensory integration, and show how the cues available in one sensory modality can modulate people's perception of stimuli in other sensory modalities (even though they may not be aware of the importance of such crossmodal influences). We also highlight evidence showing that auditory cues can influence product perception at a more semantic level, as demonstrated by research on signature sounds and emotional product sound design. © S. Hirzel Verlag EAA.


Journal article
Acta Acustica united with Acustica
Pages 1009–1025