        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/355497\" title=\"The influence of the feel of product packaging on the perception of the oral-somatosensory texture of food\" class=\"state-synced\">The influence of the feel of product packaging on the perception of the oral-somatosensory texture of food</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Most of the published research on the perception of food texture has focused on what happens in-mouth during consumption. It is, however, important to note that people's judgments of food texture can also be influenced by other sensory cues, such as haptic input, be it their direct contact with the food, or possibly also their indirect contact with the product packaging as well. The aim of the present study was to investigate whether changing the surface texture of the product packaging would affect people's perception of the product contained within-that is, we wanted to know whether the feel of the packaging held in a consumer's hand would influence the perceived texture of the food. Participants tasted biscuits and yoghurt samples from pots (yoghurt containers) that varied in terms of their surface texture (rough/granular vs. smooth). Additionally, the foodstuffs also varied in terms of their texture (crunchiness and thickness, respectively). In a 2. \u00d7. 2 experimental design, the participants assessed the texture of the foodstuff and their liking for it while holding the pot in their non-dominant hand. The results revealed that the texture of the container influenced participants' ratings of certain of the texture attributes being assessed, namely the most related ones. These findings shed light on the importance of nondiagnostic haptic cues (defined as those that objectively should not identify or prompt any effect) in the perception of food. These results, explained in terms of sensation transference, could have important implications for the food packaging and hospitality sectors. \u00a9 2012 Elsevier Ltd.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/198994\" title=\"Do The Material Properties Of Cutlery Affect The Perception Of The Food You Eat? An Exploratory Study\" class=\"state-synced\">Do The Material Properties Of Cutlery Affect The Perception Of The Food You Eat? An Exploratory Study</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Our perception of food depends both on the contextual conditions in which it is presented and on the way the food is consumed; hence, the current trend toward investigating these contextual variables in empirical research. In most meal situations, people interact with the food that they consume by means of a variety of accessories, such as dishes and cutlery that fulfill the consumers' functional requirements. However, despite the important role that these accessory stimuli play during the consumption of food, little research has been carried out in order to investigate how they may affect food perception. The present preliminary study was therefore undertaken in order to explore how the cutlery, specifically two teaspoons (a plastic one with metallic finish and a stainless steel one), affected consumers' sensory and hedonic perception of yoghurt. Consumers' quality and liking judgments concerning identical yoghurt samples differed significantly when tasted either with a metallic plastic spoon or else with a stainless steel spoon, the latter resulting in significantly higher scores. PRACTICAL APPLICATIONS: These results contribute to our knowledge of how contextual variables affect a consumer's perception of food. These findings open up new research directions emphasizing the potential importance of the properties of cutlery and its influence on consumers' perception of food. These results are relevant to product development and to caterers and restaurateurs because different dishes (i.e., foodstuffs/flavors) could be matched with different types of cutlery in order to increase convenience and, at the same, time potentially enhance the consumers' eating experience. \u00a9 2011 Wiley Periodicals, Inc.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/198729\" title=\"Crossmodal semantic priming by naturalistic sounds and spoken words enhances visual sensitivity.\" class=\"state-synced\">Crossmodal semantic priming by naturalistic sounds and spoken words enhances visual sensitivity.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">We propose a multisensory framework based on Glaser and Glaser's (1989) general reading-naming interference model to account for the semantic priming effect by naturalistic sounds and spoken words on visual picture sensitivity. Four experiments were designed to investigate two key issues: First, can auditory stimuli enhance visual sensitivity when the sound leads the picture as well as when they are presented simultaneously? And, second, do naturalistic sounds (e.g., a dog's \"woofing\") and spoken words (e.g., /d\u0254g/) elicit similar semantic priming effects? Here, we estimated participants' sensitivity and response criterion using signal detection theory in a picture detection task. The results demonstrate that naturalistic sounds enhanced visual sensitivity when the onset of the sounds led that of the picture by 346 ms (but not when the sounds led the pictures by 173 ms, nor when they were presented simultaneously, Experiments 1-3A). At the same SOA, however, spoken words did not induce semantic priming effects on visual detection sensitivity (Experiments 3B and 4A). When using a dual picture detection/identification task, both kinds of auditory stimulus induced a similar semantic priming effect (Experiment 4B). Therefore, we suggest that there needs to be sufficient processing time for the auditory stimulus to access its associated meaning to modulate visual perception. Besides, the interactions between pictures and the two types of sounds depend not only on their processing route to access semantic representations, but also on the response to be made to fulfill the requirements of the task.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/175215\" title=\"Reversing the Colavita visual dominance effect.\" class=\"state-synced\">Reversing the Colavita visual dominance effect.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Many researchers have taken the Colavita effect to represent a paradigm case of visual dominance. Broadly defined, the effect occurs when people fail to respond to an auditory target if they also have to respond to a visual target presented at the same time. Previous studies have revealed the remarkable resilience of this effect to various manipulations. In fact, a reversal of the Colavita visual dominance effect (i.e., auditory dominance) has never been reported. Here, we present a series of experiments designed to investigate whether it is possible to reverse the Colavita effect when the target stimuli consist of repetitions embedded in simultaneously presented auditory and visual streams of stimuli. In line with previous findings, the Colavita effect was still observed for an immediate repetition task, but when an n-1 repetition detection task was used, a reversal of visual dominance was demonstrated. These results suggest that masking from intervening stimuli between n-1 repetition targets was responsible for the elimination and reversal of the Colavita visual dominance effect. They further suggest that varying the presence of a mask (pattern, conceptual, or absent) in the repetition detection task gives rise to different patterns of sensory dominance (i.e., visual dominance, an elimination of the Colavita effect, or even auditory dominance).</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/172103\" title=\"Attention and suppression affect tactile perception in reach-to-grasp movements.\" class=\"state-synced\">Attention and suppression affect tactile perception in reach-to-grasp movements.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Reaching with the hand is characterized by a decrease in sensitivity to tactile stimuli presented to the moving hand. Here, we investigated whether tactile suppression can be canceled by attentional orienting. In a first experiment, participants performed a dual-task involving a goal-directed movement paired with the speeded detection of a tactile pulse. The pulse was either delivered to the moving or stationary hand, during movement preparation, execution, or the post-movement phase. Furthermore, stimulation was delivered with equal probability to either hand, or with a higher probability to either the moving or resting hand. The results highlighted faster RTs under conditions of higher probability of stimulation delivery to both moving and resting hands, thus indicating an attentional effect. For the motor preparation period, RTs were faster only at the resting hand under conditions where tactile stimulation was more likely to be delivered there. In a second experiment, a non-speeded perceptual task was used as a secondary task and tactile discrimination thresholds were recorded. Tactile stimulation was delivered concomitantly at both index fingers either in the movement preparation period (both before and after the selection of the movement effector had taken place), in the motor execution period, or, in a control condition, in the time-window of motor execution, but the movement of the hand was restrained. In the preparation period, tactile thresholds were comparable for the two timings of stimulation delivery; i.e., before and after the selection of the movement effector had taken place. These results therefore suggest that shortly prior to, and during, the execution of goal-directed movements, a combined facilitatory and inhibitory influence acts on tactile perception.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/90781\" title=\"As bitter as a trombone: synesthetic correspondences in nonsynesthetes between tastes/flavors and musical notes.\" class=\"state-synced\">As bitter as a trombone: synesthetic correspondences in nonsynesthetes between tastes/flavors and musical notes.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">In parallel to studies of various cases of synesthesia, many cross-modal correspondences have also been documented in nonsynesthetes. Among these correspondences, implicit associations between taste and pitch have been reported recently (Crisinel &amp; Spence, 2009, 2010). Here, we replicate and extend these findings through explicit matching of sounds of varying pitch to a range of tastes/flavors. In addition, participants in the experiment reported here also chose the type of musical instrument most appropriate for each taste/flavor. The association of sweet and sour tastes to high-pitched notes was confirmed. By contrast, umami and bitter tastes were preferentially matched to low-pitched notes. Flavors did not display such strong pitch associations. The choice of musical instrument seems to have been driven primarily by a matching of the hedonic value and familiarity of the two types of stimuli. Our results raise important questions about our representation of tastes and flavors and could also lead to applications in the marketing of food products.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/90780\" title=\"An expectations-based approach to explaining the cross-modal influence of color on orthonasal olfactory identification: the influence of the degree of discrepancy.\" class=\"state-synced\">An expectations-based approach to explaining the cross-modal influence of color on orthonasal olfactory identification: the influence of the degree of discrepancy.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">In the present study, we explored the conditions under which color-generated expectations influence participants' identification of flavored drinks. Four experiments were conducted in which the degree of discrepancy between the expected identity of a flavor (derived from the color of a drink) and the actual identity of the flavor (derived from orthonasal olfactory cues) was examined. Using a novel experimental approach that controlled for individual differences in color-flavor associations, we first measured the flavor expectations held by each individual and only then examined whether the same individual's identification responses were influenced by his or her own expectations. Under conditions of low discrepancy, the perceived disparity between the expected and the actual flavor identities was small. When a particular color--identified by participants as one that generated a strong flavor expectation--was added to these drinks (as compared with when no such color was added), a significantly greater proportion of identification responses were consistent with this expectation. This held true even when participants were explicitly told that color would be an uninformative cue and were given as much time as desired to complete the task. By contrast, under conditions of high discrepancy, adding the same colors to the drinks no longer had the same effect on participants' identification responses. Critically, there was a significant difference in the proportion of responses that were consistent with participants' color-based expectations in conditions of low as compared with high discrepancy, indicating that the degree of discrepancy between an individual's actual and expected experience can significantly affect the extent to which color influences judgments of flavor identity.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/90779\" title=\"Crossmodal facilitation of masked visual target identification.\" class=\"state-synced\">Crossmodal facilitation of masked visual target identification.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">In the present study, participants identified the location of a visual target presented in a rapidly masked, changing sequence of visual distractors. In Experiment 1, we examined performance when a high tone, embedded in a sequence of low tones, was presented in synchrony with the visual target and observed that the high tone improved visual target identification, relative to a condition in which a low tone was synchronized with the visual target, thus replicating Vroomen and de Gelder's (2000, Experiment 1) findings. In subsequent experiments, we presented a single visual, auditory, vibrotactile, or combined audiotactile cue with the visual target and found similar improvements in participants' performance regardless of cue type. These results suggest that crossmodal perceptual organization may account for only a part of the improvement in participants' visual target identification performance reported in Vroomen and de Gelder's original study. Moreover, in contrast with many previous crossmodal cuing studies, our results also suggest that visual cues can enhance visual target identification performance. Alternative accounts for these results are discussed in terms of enhanced saliency, the presence of a temporal marker, and attentional capture by oddball stimuli as potential explanations for the observed performance benefits.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/22916\" title=\"Citric Acid Modulates Discrimination of Sweetness Intensity in Sucrose Solutions\" class=\"state-synced\">Citric Acid Modulates Discrimination of Sweetness Intensity in Sucrose Solutions</a>\n            </h4>\n            \n            \n            \n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/21672\" title=\"The Influence of Color and Label Information on Perceptions of Chocolate\" class=\"state-synced\">The Influence of Color and Label Information on Perceptions of Chocolate</a>\n            </h4>\n            \n            \n            \n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/8922\" title=\"The multisensory perception of flavor: Assessing the influence of color cues on flavor discrimination responses\" class=\"state-synced\">The multisensory perception of flavor: Assessing the influence of color cues on flavor discrimination responses</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Two experiments are reported that were designed to investigate the influence of visual color cues on people's flavor discrimination and flavor intensity ratings for a variety of fruit-flavored solutions. In Experiment 1, the participants had to associate specific flavors with solutions of various colors simply by looking at them (i.e., without tasting them). In Experiment 2, the participants tasted the solutions and had to discriminate the flavor of solutions that had been colored either 'appropriately' or 'inappropriately', or else presented as colorless solutions. The participants were explicitly informed that the colors of the solutions provided no useful information regarding the actual flavor identity of the solutions. The participants also rated the flavor intensity of the solutions. The accuracy of participants' flavor discrimination performance was significantly lower when the solutions were colored inappropriately than when they were colored appropriately (or else were presented as colorless solutions). These results show that the modulatory effect of visual cues on flavor perception can override participants' awareness that the solutions would frequently be colored inappropriately. \u00a9 2007 Elsevier Ltd. All rights reserved.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/19212\" title=\"Crossmodal change blindness between vision and touch.\" class=\"state-synced\">Crossmodal change blindness between vision and touch.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Change blindness is the name given to people's inability to detect changes introduced between two consecutively-presented scenes when they are separated by a distractor that masks the transients that are typically associated with change. Change blindness has been reported within vision, audition, and touch, but has never before been investigated when successive patterns are presented to different sensory modalities. In the study reported here, we investigated change detection performance when the two to-be-compared stimulus patterns were presented in the same sensory modality (i.e., both visual or both tactile) and when one stimulus pattern was tactile while the other was presented visually or vice versa. The two to-be-compared patterns were presented consecutively, separated by an empty interval, or else separated by a masked interval. In the latter case, the masked interval could either be tactile or visual. The first experiment investigated visual-tactile and tactile-visual change detection performance. The results showed that in the absence of masking, participants detected changes in position accurately, despite the fact that the two to-be-compared displays were presented in different sensory modalities. Furthermore, when a mask was presented between the two to-be-compared displays, crossmodal change blindness was elicited no matter whether the mask was visual or tactile. The results of two further experiments showed that performance was better overall in the unimodal (visual or tactile) conditions than in the crossmodal conditions. These results suggest that certain of the processes underlying change blindness are multisensory in nature. We discuss these findings in relation to recent claims regarding the crossmodal nature of spatial attention.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/31216\" title=\"Crossmodal associations and interactions between olfaction and vision\" class=\"state-synced\">Crossmodal associations and interactions between olfaction and vision</a>\n            </h4>\n            \n            \n            \n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/20478\" title=\"The modulation of crossmodal integration by unimodal perceptual grouping: a visuotactile apparent motion study.\" class=\"state-synced\">The modulation of crossmodal integration by unimodal perceptual grouping: a visuotactile apparent motion study.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">We adapted the crossmodal dynamic capture task to investigate the modulation of visuotactile crossmodal integration by unimodal visual perceptual grouping. The influence of finger posture on this interaction was also explored. Participants were required to judge the direction of a tactile apparent motion stream (moving either to the left or to the right) presented to their crossed or uncrossed index fingers. The participants were instructed to ignore a distracting visual apparent motion stream, comprised of either 2 or 6 lights presented concurrently with the tactile stimuli. More crossmodal dynamic capture of the direction of the tactile apparent motion stream by the visual apparent motion stream was observed in the 2-lights condition than in the 6-lights condition. This interaction was not modulated by finger posture. These results suggest that visual intramodal perceptual grouping constrains the crossmodal binding of visual and tactile apparent motion information, irrespective of finger posture.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/30349\" title=\"Multisensory synesthetic interactions in the speeded classification of visual size.\" class=\"state-synced\">Multisensory synesthetic interactions in the speeded classification of visual size.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">In the present study, we attempted to demonstrate a synesthetic relationship between auditory frequency and visual size. In Experiment 1, participants performed a speeded visual size discrimination task in which they had to judge whether a variable-sized disk was bigger or smaller than a standard reference disk. A task-irrelevant sound that was either synesthetically congruent with the relative size of the disk (e.g., a low-frequency sound presented with a bigger disk) or synesthetically incongruent with it (e.g., a low-frequency sound presented with a smaller disk) was sometimes presented together with the variable disk. Reaction times were shorter in the synesthetically congruent condition than in the incongruent condition. Verbal labeling and semantic mediation interpretations of this interaction were explored in Experiment 2, in which high- and low-frequency sounds were presented in separate blocks of trials, and in Experiment 3, in which the tones were replaced by the spoken words \"high\" and \"low.\" Response priming/bias explanations were ruled out in Experiment 4, in which a synesthetic congruency effect was still reported even when participants made same-versus-different discrimination responses regarding the relative sizes of the two disks. Taken together, these results provide the first empirical demonstration that the relative frequency of an irrelevant sound can influence the speed with which participants judge the size of visual stimuli when the sound varies on a trial-by-trial basis along a synesthetically compatible dimension. The possible cognitive bases for this synesthetic association are also discussed.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/28628\" title=\"Effect of posture change on tactile perception: impaired direction discrimination performance with interleaved fingers.\" class=\"state-synced\">Effect of posture change on tactile perception: impaired direction discrimination performance with interleaved fingers.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">We report a series of experiments in which participants had to judge the direction in which a pair of vibrotactile stimuli presented to two adjacent digits of either the same or different hands were stimulated (left-to-right or vice versa in experiments 1 and 2; near-to-far or vice versa in experiment 3, at stimulus onset asynchronies varying between 100 and 600 ms). When the participant's hands were placed side-by-side (anatomical posture), with their fingers either pointing away from them or else pointing toward the midline, directional discrimination performance was generally accurate. By contrast, when the fingers of the two hands were interleaved in either of these postures, performance deteriorated significantly for certain specific combinations of digits, with a more pronounced impairment seen when the fingers pointed away from the participant than when they pointed toward the midline. This decline in tactile direction discrimination performance in the interleaved fingers posture appears to reflect a failure to represent the position of tactile stimuli correctly when the fingers of the two hands are interleaved.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/1193\" title=\"Modifying the multisensory perception of a carbonated beverage using auditory cues\" class=\"state-synced\">Modifying the multisensory perception of a carbonated beverage using auditory cues</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Our perception of the level of carbonation in a beverage often relies on the integration of a variety of multisensory cues from vision, oral-somatosensation, nociception, audition, and possibly even manual touch. In the present study, we specifically investigated the role of auditory cues in the perception of carbonation in beverages. In Experiment 1, participants rated a series of sparkling water samples in terms of their perceived carbonation using a visual analogue scale. The water sounds were modified by changing the loudness and/or frequency composition of the auditory feedback emitted by the water samples. The carbonated water samples were judged to be more carbonated when the overall sound level was increased and/or when the high frequency components (2-20 kHz) of the water sound were amplified. In Experiment 2, we investigated whether the perception of carbonation of physically different water samples (still, semi-sparkling, and sparkling) varied as a function of the distance at which the beverage was held from the participant. The sparkling water samples were evaluated as being more carbonated when they were held close to the ear rather than further away. In Experiment 3, we investigated whether the auditory manipulations reported in Experiment 1 might also influence the perception of the level of carbonation and oral irritation of water samples in the mouth. The results revealed that neither perceived carbonation nor the perceived oral irritation were influenced by variations in the level of auditory feedback. Taken together, these results highlight the significant role that auditory cues play in modulating our perception of the carbonation of beverages in the hand, and the dominance of oral-somatosensory and nociceptive cues over auditory cues in the perception of carbonation of beverages in the mouth. \u00a9 2005 Elsevier Ltd. All rights reserved.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/23919\" title=\"Audiotactile interactions in near and far space.\" class=\"state-synced\">Audiotactile interactions in near and far space.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">In this study we investigated audiotactile spatial interactions in the region behind the head. In experiment 1, participants made unspeeded temporal order judgments (TOJs) regarding pairs of auditory and tactile stimuli presented at varying stimulus onset asynchronies (SOAs) using the method of constant stimuli. Electrocutaneous stimuli were presented to the left or right earlobe while auditory stimuli were presented from just behind the participant's head on either the same or opposite side. Participants responded significantly more accurately when the stimuli were presented from different sides rather than from the same side. In experiment 2, we used a distractor interference task to show that speeded left/right discrimination responses to electrocutaneous targets were also modulated by the spatial congruency of auditory distractors presented behind the head. Performance was worse (i.e. response latencies were slower and error rates higher) when the auditory distractors were presented on the opposite side to the electrocutaneous target than when they were presented on the same side. This crossmodal distractor interference effect was larger when white noise distractors were presented from close to the head (20 cm) than when they were presented far from the head (70 cm). By contrast, pure tone distractors elicited a smaller crossmodal distractor interference effect overall, and showed no modulation as a function of distance. Taken together, these results suggest that the spatial modulation of audiotactile interactions occurs predominantly for complex auditory stimuli (for example, white noise) originating from the region close to the back of the head.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/7291\" title=\"Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration.\" class=\"state-synced\">Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">We examined whether monitoring asynchronous audiovisual speech induces a general temporal recalibration of auditory and visual sensory processing. Participants monitored a videotape featuring a speaker pronouncing a list of words (Experiments 1 and 3) or a hand playing a musical pattern on a piano (Experiment 2). The auditory and visual channels were either presented in synchrony, or else asynchronously (with the visual signal leading the auditory signal by 300 ms; Experiments 1 and 2). While performing the monitoring task, participants were asked to judge the temporal order of pairs of auditory (white noise bursts) and visual stimuli (flashes) that were presented at varying stimulus onset asynchronies (SOAs) during the session. The results showed that, while monitoring desynchronized speech or music, participants required a longer interval between the auditory and visual stimuli in order to perceive their temporal order correctly, suggesting a widening of the temporal window for audiovisual integration. The fact that no such recalibration occurred when we used a longer asynchrony (1000 ms) that exceeded the temporal window for audiovisual integration (Experiment 3) supports this conclusion.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n", 
        "\n\n    <div class=\"listing-item listing-item-search\" itemscope itemprop=\"itemListElement\" itemtype=\"http://schema.org/ListItem\">\n        \n        <div class=\"media-body\">\n        \n            <h4 class=\"media-heading\">\n                <a href=\"https://www.psy.ox.ac.uk/publications/13898\" title=\"Visual bias of unseen hand position with a mirror: spatial and temporal factors.\" class=\"state-synced\">Visual bias of unseen hand position with a mirror: spatial and temporal factors.</a>\n            </h4>\n            \n            \n            \n            \n                <p data-truncate=\"yes\" data-truncate-lines=\"2\">Two experiments examined the integration of visual and proprioceptive information concerning the location of an unseen hand, using a mirror positioned along the midsagittal plane. In experiment 1, participants tapped the fingers of both hands in synchrony, while viewing the mirror-reflection of their left hand. After 6 s, participants made reaching movements to a target with their unseen right hand behind the mirror. Reaches were accurate when visually and proprioceptively specified hand positions were congruent prior to the reach, but significantly biased by vision when the visual location conflicted with the real location. This effect was independent of the target location and depended strongly upon the relative position of the mirror-reflected hand. In experiment 2, participants made reaching movements following 4, 8, or 12 s active visuomotor or passive visual exposure to the mirror, or following passive exposure without the mirror. Reaching was biased more by the visual location following active visuomotor compared to passive visual exposure, and this bias increased with the duration of visual exposure. These results suggest that the felt position of the hand depends upon an integrated, weighted sum of visual and proprioceptive information. Visual information is weighted more strongly under active visuomotor than passive visual exposure, and with increasing exposure duration to the mirror reflected hand.</p>\n            \n\n            \n                                \n        </div>\n        \n    </div>\n\n\n"