
A Methodology for Multisensory Product Experience Design Using Cross-modal Effect: A Case of SLR Camera

Published online by Cambridge University Press:  26 July 2019

Abstract


Throughout the course of a product experience, a user employs multiple senses, including vision, hearing, and touch. Previous cross-modal studies have shown that these senses interact with each other and alter perception. In this paper, we propose a methodology for designing multisensory product experiences by applying cross-modal effects between simultaneous stimuli. In this methodology, we first obtain a model of the comprehensive cognitive structure of the user's multisensory experience by applying a Kansei modeling methodology, and we extract opportunities for cross-modal effects from this structure. Second, we conduct experiments on these cross-modal effects and formulate them by fitting a regression curve to the results. Finally, we derive solutions for improving the product's sensory experience from the regression models of the target cross-modal effects. We demonstrate the validity of the methodology with a case study of SLR cameras, a typical product perceived through multiple senses.
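As an illustration of the regression step described above, the following is a minimal sketch in Python (not the authors' code): it fits a curve relating a visual stimulus parameter to the rating of a simultaneously presented sound, then uses the fitted cross-modal model to choose a design value. The gloss levels, sharpness ratings, and target rating are hypothetical placeholders.

    # A minimal sketch (not the authors' code) of the regression step described in
    # the abstract: fit a curve relating a visual stimulus parameter to the rating
    # of a simultaneously presented sound, then use the fitted cross-modal model to
    # choose a design value. All data and names below are hypothetical placeholders.
    import numpy as np

    # Hypothetical experiment: surface gloss level (0-1) shown while a shutter
    # sound is played, and the mean perceived "sharpness" rating of that sound (1-7).
    gloss_level = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    sharpness_rating = np.array([3.1, 3.4, 3.9, 4.6, 5.2, 5.5])

    # Second-order polynomial regression as one example of a "regression curve".
    coeffs = np.polyfit(gloss_level, sharpness_rating, deg=2)
    model = np.poly1d(coeffs)

    # Search the design range for the gloss level whose predicted rating is
    # closest to a target perceived sharpness chosen by the designer.
    target_rating = 5.0
    candidates = np.linspace(0.0, 1.0, 101)
    best_gloss = candidates[np.argmin(np.abs(model(candidates) - target_rating))]

    print("Fitted coefficients:", coeffs)
    print(f"Gloss level predicted to yield sharpness ~{target_rating}: {best_gloss:.2f}")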

Type
Article
Creative Commons
CC BY-NC-ND
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
© The Author(s) 2019

References

Altinsoy, M.E. and Merchel, S. (2010), "Cross-modal frequency matching: Sound and whole-body vibration", Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). https://doi.org/10.1007/978-3-642-15841-4_5
Charpentier, A. (1891), "Analyse expérimentale: De quelques éléments de la sensation de poids [Experimental analysis: On some of the elements of the sensation of weight]", Archives de Physiologie Normale et Pathologique, Vol. 3, pp. 122–135.
Deliza, R. and MacFie, H.J.H. (1996), "The generation of sensory expectation by external cues and its effect on sensory perception and hedonic ratings: A review", Journal of Sensory Studies, Vol. 11 No. 2, pp. 103–128.
Driver, J. and Spence, C. (1998), "Cross-modal links in spatial attention", Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences, Vol. 353 No. 1373, pp. 1319–1331. https://doi.org/10.1098/rstb.1998.0286
Gescheider, G.A. and Niblette, R.K. (1967), "Cross-modality masking for touch and hearing", Journal of Experimental Psychology, Vol. 74 No. 3, pp. 313–320. https://doi.org/10.1037/h0024700
Klatzky, R.L., Lederman, S.J. and Matula, D.E. (1993), "Haptic exploration in the presence of vision", Journal of Experimental Psychology: Human Perception and Performance, Vol. 19 No. 4, pp. 726–743.
Lederman, S.J., Thorne, G. and Jones, B. (1986), "Perception of texture by vision and touch: Multidimensionality and intersensory integration", Journal of Experimental Psychology: Human Perception and Performance, Vol. 12 No. 2, pp. 169–180.
Ludden, G., Schifferstein, H. and Hekkert, P. (2009), "Visual-tactual incongruities in products as sources of surprise", Empirical Studies of the Arts, Vol. 27 No. 1, pp. 61–87.
Schifferstein, H.N.J., Fenko, A., Desmet, P.M.A., Labbe, D. and Martin, N. (2013), "Influence of package design on the dynamics of multisensory and emotional food experience", Food Quality and Preference, Vol. 27 No. 1, pp. 18–25. https://doi.org/10.1016/j.foodqual.2012.06.003
Spence, C., Nicholls, M.E.R., Gillespie, N. and Driver, J. (1998), "Cross-modal links in exogenous covert spatial orienting between touch, audition, and vision", Perception & Psychophysics, Vol. 60 No. 4, pp. 544–557. https://doi.org/10.3758/BF03206045
Verrillo, R.T. (1971), "Vibrotactile thresholds measured at the finger", Perception & Psychophysics, Vol. 9 No. 4, pp. 329–330.
Vroomen, J. and De Gelder, B. (2000), "Sound enhances visual perception: Cross-modal effects of auditory organization on vision", Journal of Experimental Psychology: Human Perception and Performance, Vol. 26 No. 5, pp. 1583–1590. https://doi.org/10.1037/0096-1523.26.5.1583
Wilson, E.C., Reed, C.M. and Braida, L.D. (2010), "Integration of auditory and vibrotactile stimuli: Effects of frequency", The Journal of the Acoustical Society of America, Vol. 127 No. 5, pp. 1960–1974. https://doi.org/10.1121/1.3365318
Yanagisawa, H., Miyazaki, C. and Bouchard, C. (2017), "Kansei modeling methodology for multisensory UX design", International Conference on Engineering Design (ICED17), Vancouver, Canada, 21–25 August 2017, Vol. 8: Human Behaviour in Design, pp. 159–168.
Yanagisawa, H., Miyazaki, C. and Nakano, S. (2016), "Kansei modeling for multimodal user experience (visual expectation effect on product sound perception)", Proceedings of INTER-NOISE 2016 - 45th International Congress and Exposition on Noise Control Engineering: Towards a Quieter Future, pp. 1793–1802.
Yanagisawa, H. and Takatsuji, K. (2015), "Effects of visual expectation on perceived tactile perception: An evaluation method of surface texture with expectation effect", International Journal of Design, Vol. 9 No. 1, pp. 39–51.