Book contents
- Frontmatter
- Contents
- Figures
- Tables
- Preface
- Chapter 1 Introduction to Eye-Tracking
- Chapter 2 Choosing the Equipment
- Chapter 3 Practicalities of Eye-Tracking
- Chapter 4 Researching Reading
- Chapter 5 Researching Listening and Multimodal Input
- Chapter 6 Using Eye-Tracking in Other Contexts
- Chapter 7 Working with the Data
- Chapter 8 Conclusions
- References
- Index
5 - Researching Listening and Multimodal Input
Published online by Cambridge University Press: 14 September 2019
Introduction
It's clear that reading is a visual task, so it is fairly obvious why eye-tracking is an appropriate methodology to study it. But how (and why) might we use eye-tracking to study listening?
When listening, most of us tend to look at something. In experimental contexts, we can manipulate what participants see so that the visual input bears various relationships to what is heard, and then explore how these relationships influence looking patterns. In many authentic contexts the auditory and visual input are also closely connected. In audio storybooks for children, the pictures correspond to the story that children hear. In other situations, such as watching a film, the correspondence between the auditory and visual input might seem less clear – what characters talk about does not necessarily relate to the visual environment. Nevertheless, eye-tracking can be informative in a number of ways. For instance, if listeners generally look at whoever is speaking, what happens when subtitles are present on the screen? Eye-tracking can address this question, as well as many others in the context of listening and multimodal input, which is the focus of this chapter.
The discussion of reading in Chapter 4 showed that there are limits on the eyes’ visual acuity. Vision is sharp around the fovea and decreases moving outwards. The eyes move (saccade) three to four times a second, with brief pauses (fixations), to bring new regions into an area of good visual acuity. These constraints on the visual system apply to viewing static and moving images just as they do to reading, which means that the basic pattern of saccades and fixations is observed across a wide range of tasks. Importantly, eye-movements in both reading and viewing are thought to be under cognitive control (Luke and Henderson, 2016; Rayner, 2009; but see Vitu, 2003, for an alternative viewpoint). This means that the cognitive processes related to perception, memory and language influence the ‘where and when’ of eye-movements (Luke and Henderson, 2016). For applied linguists, what is important is that the eyes move quickly in response to the linguistic input and the perceptual features of what is being looked at.
Eye-Tracking: A Guide for Applied Linguistics Research, pp. 112–138. Publisher: Cambridge University Press. Print publication year: 2018.