Published online by Cambridge University Press: 05 August 2021
This study examined the role of visual cues (facial expressions and hand gestures) in second language (L2) speech assessment. University students (N = 60) at English-medium universities assessed 2-minute video clips of 20 L2 English speakers (10 Chinese and 10 Spanish speakers) narrating a personal story. They rated the speakers’ comprehensibility, accentedness, and fluency using 1,000-point sliding scales. To manipulate access to visual cues, raters were assigned to one of three conditions that presented the audio along with (a) a static image of the speaker, (b) a static image of the speaker’s torso with a dynamic face, or (c) a dynamic torso and face. Results showed that raters with access to the full video tended to perceive the speakers as more comprehensible and rated them as significantly less accented than raters in the less visually informative conditions. The findings are discussed in terms of how the integration of visual cues may impact L2 speech assessment.
We would like to thank the members of our research group (Tzu-Hua Chen, Teng Hsu, YooLae Kim, Chen Liu, Oguzhan Tekin, and Pakize Uludag) for their valuable insights, as well as all the research assistants who helped with data collection and coding: Marie Apaloo, Tzu-Hua Chen, Dalia Elsayed, Sarah Ercoli, Lisa Gonzalez, Xuanji Hu, Chen Liu, Ashley Montgomery, Jie Qiu, Quinton Stotz, Lauren Strachan, Kym Taylor Reid, Oguzhan Tekin, Pakize Uludag, and Roza van Lieshout. We are also grateful to Masaki Eguchi and Shungo Suzuki for their help with data analysis, and to the anonymous reviewers and the journal editors of Studies in Second Language Acquisition for their insightful comments and suggestions.
This research was supported by the Social Sciences and Humanities Research Council of Canada (SSHRC) grants 435-2016-1406 (to Pavel Trofimovich and Sara Kennedy) and 435-2019-0754 (to Kim McDonough and Pavel Trofimovich).
The experiment in this article earned Open Materials and Open Data badges for transparent practices. The materials and data are available at https://osf.io/vcb4a/.