
Sonification of Emotion: Strategies and results from the intersection with music

Published online by Cambridge University Press: 26 February 2014

R. Michael Winters and Marcelo M. Wanderley
Affiliation:
Input Devices and Music Interaction Laboratory, CIRMMT, Schulich School of Music, McGill University, 555 Sherbrooke St W, H3A 1E3 Montreal, QC, Canada

Abstract

Emotion is a word not often heard in sonification, though advances in affective computing make the data type imminent. At times the relationship between emotion and sonification has been contentious due to an implied overlap with music. This paper clarifies the relationship, demonstrating how it can be mutually beneficial. After identifying contexts favourable to the auditory display of emotion, and the utility of its development to research in musical emotion, the current state of the field is addressed, reiterating the necessary conditions for a sound to qualify as a sonification of emotion. Within this framework, strategies for display are presented that use acoustic and structural cues designed to target select auditory-cognitive mechanisms of musical emotion. Two sonifications using these strategies to convey arousal and valence are then described; they differ in design methodology: one was designed ecologically, the other computationally. Each model is sampled for 15 seconds at each of 49 evenly distributed points on the arousal-valence (AV) space and evaluated using a publicly available tool for computational music emotion recognition. The computational design performed 65 times better in this test, but the ecological design is argued to be more useful for emotional communication. With these limitations in mind, computational design and evaluation are nonetheless supported for future development.
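For concreteness, the evaluation procedure described in the abstract can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions: the 7 × 7 grid realising the 49 points, the helper names (av_grid, av_to_params, render_excerpt), and the specific cue mappings (tempo and loudness driven by arousal; mode and brightness by valence) are illustrative choices drawn from the broader music-emotion literature, not the paper's confirmed implementation.

```python
import numpy as np

def av_grid(n_per_axis=7, lo=-1.0, hi=1.0):
    """49 evenly distributed (arousal, valence) points on the AV space
    (a 7 x 7 grid); each point drives one 15-second sound excerpt."""
    axis = np.linspace(lo, hi, n_per_axis)
    return [(a, v) for a in axis for v in axis]

def av_to_params(arousal, valence):
    """Hypothetical mapping from an AV point to synthesis parameters.
    Cue choices follow common findings in music-emotion research and
    are not necessarily the mappings used in the paper."""
    return {
        "tempo_bpm": 90 + 60 * arousal,      # faster = higher arousal
        "loudness_db": -20 + 12 * arousal,   # louder = higher arousal
        "mode": "major" if valence >= 0 else "minor",
        "brightness": 0.5 + 0.4 * valence,   # relative spectral weighting
    }

if __name__ == "__main__":
    for arousal, valence in av_grid():
        params = av_to_params(arousal, valence)
        # render_excerpt(params, duration_s=15)  # hypothetical synthesis step
        print(f"A={arousal:+.2f} V={valence:+.2f} -> {params}")
```

Evaluation would then pass each rendered excerpt through the music emotion recognition tool, presumably comparing its predicted AV coordinates against the target point at which the excerpt was generated.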

Type: Articles
Copyright © Cambridge University Press 2014

