Book contents
- Frontmatter
- Contents
- Contributors
- 1 Introduction: Social Signal Processing
- Part I Conceptual Models of Social Signals
- Part II Machine Analysis of Social Signals
- Part III Machine Synthesis of Social Signals
- 19 Speech Synthesis: State of the Art and Challenges for the Future
- 20 Body Movements Generation for Virtual Characters and Social Robots
- 21 Approach and Dominance as Social Signals for Affective Interfaces
- 22 Virtual Reality and Prosocial Behavior
- 23 Social Signal Processing in Social Robotics
- Part IV Applications of Social Signal Processing
- References
23 - Social Signal Processing in Social Robotics
from Part III - Machine Synthesis of Social Signals
Published online by Cambridge University Press: 13 July 2017
Introduction
In recent years, the roles of robots have become increasingly social, leading to a shift from machines designed for traditional human–robot interaction (HRI), such as industrial robots, to machines intended for social HRI. As a result, the wide range of robotics applications today includes service and household robots, museum and reception attendants, toys and entertainment devices, educational robots, route guides, and robots for elderly assistance, therapy, and rehabilitation. In light of this transformation of application domains, many researchers have investigated improved designs and capabilities that allow robots to engage in meaningful social interactions with humans (Breazeal, 2003).
The term social robots was defined by Fong, Nourbakhsh, and Dautenhahn (2003) to describe “embodied agents that are part of a heterogeneous group: a society of robots or humans. They are able to recognize each other and engage in social interactions, they possess histories (perceive and interpret the world in terms of their own experience), and they explicitly communicate with and learn from each other” (p. 144). Other terms that have been widely used are “socially interactive robots” (Fong et al., 2003), with an emphasis on peer-to-peer multimodal interaction and communication between robots and people, and “sociable robots” (Breazeal, 2002), which proactively engage with people based on models of social cognition. A discussion of the different concepts of social robots can be found in Dautenhahn (2007). Note that all the above definitions consider social robots in the context of interactions with humans; this is in contrast to approaches in collective and swarm robotics (Kube, 1993; Bonabeau, Dorigo, & Theraulaz, 1999; Kernbach, 2013), which emphasise interactions among large groups of (typically) identical robots that rely strongly on communication mediated by the environment and afforded by the physical embodiment of the robots.
Together with the attempt to name and define this new category of robots, a whole new research area – social robotics – has since emerged. Social robotics research is dedicated to designing, developing, and evaluating robots that can engage in social environments in a way that is appealing and familiar to human interaction partners (Salem et al., 2013). However, interaction is often difficult as inexperienced users struggle to understand the robot's internal states, intentions, actions, and expectations. To facilitate successful interaction, social robots should therefore provide communicative functionality that is intuitive and, to some extent, natural to humans.
- Type: Chapter
- Information: Social Signal Processing, pp. 317–328
- Publisher: Cambridge University Press
- Print publication year: 2017