We often estimate, or perceive, the quality of materials, surfaces, and objects, what the Japanese refer to as 'shitsukan', by means of several of our senses. The majority of the literature on shitsukan perception has, however, tended to focus on the unimodal visual evaluation of stimulus properties. In part, this presumably reflects the widespread hegemony of the visual in the modern era and, in part, results from the growing interest, not to mention the impressive advances, in digital rendering amongst the computer graphics community. Regardless of this oculocentric bias in so much of the empirical literature, it is important to note that several other senses often contribute to the impression of the material quality of surfaces, materials, and objects as experienced in the real world, rather than just in virtual reality. Understanding the multisensory contributions to the perception of material quality, especially when combined with computational and neural data, is likely to have implications for a number of fields of basic research, as well as being applicable to emerging domains such as multisensory augmented retail and multisensory packaging design.

Original publication

DOI: 10.1163/22134808-bja10003
Type: Journal article
Journal: Multisensory Research
Publication Date: 05/03/2020
Pages: 1 - 39
Keywords: Shitsukan, crossmodal, incongruency, material perception, multisensory, quality perception