
How deep is AI's love? Understanding relational AI

Published online by Cambridge University Press: 05 April 2023

Omri Gillath
Syed Abumusab
Ting Ai
Michael S. Branicky
Robert B. Davison
Maxwell Rulo
John Symons
Gregory Thomas

Affiliation: Department of Psychology, University of Kansas, Lawrence, KS 66045, USA
ogillath@ku.edu; syedmusab@ku.edu; tingai@ku.edu; msb@ku.edu; rbd@ku.edu; maxrulo20@ku.edu; johnfsymons@gmail.com; gthomas@ku.edu
https://gillab.ku.edu/

Abstract

We suggest that as people come to construe robots as social agents, interact with them, and treat them as capable of social ties, they might develop (close) relationships with them. We then ask what kinds of relationships people can form with bots, what functions bots can fulfill, and what the societal and moral implications of such relationships are.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

Clark and Fischer (C&F) argue that people regard social robots ("bots" for short) as depictions of social agents rather than as actual social agents. In contrast, we suggest that bots can play a variety of social roles, including that of relationship partner. Rather than thinking of bots as representing or imitating social agents, we encourage readers to approach bots as capable of filling substantive roles within social systems. Adopting this approach raises a range of research questions concerning bots' roles, status, and nature, and the moral consequences of bots filling such social roles.

C&F further suggest that the way authority figures (e.g., developers, engineers, corporate executives) present bots shapes people's construal of the bots, the kinds of interactions they have with them, and, eventually, the relationships they form with them. The authors note that social bots already serve people as tutors, caretakers, receptionists, companions, and other social agents. The term "serve" implies that bots are perceived as tools or servants, and that they are potentially programmed as such. Both categories (tools and servants), while not identical, introduce a derogating filter. People often view servants as out-group members ("lessers"), show them disrespect, discriminate against them, and are less likely to form authentic relationships with them (e.g., Smooth; C&F, target article). People are also disinclined to form relationships with what they see as tools (however, see work on transitional objects, objectophilia, and attachment to objects, e.g., Lehman, Denham, Moser, & Reeves, 1992; Melumad & Pham, 2020). The authors' perspective also overlooks the role of other powerful social actors (e.g., media depictions of human relationships with, and attachment to, bots) and the degree to which bots employ self-learning algorithms to adapt and change on their own.

Even if C&F are correct that bots are merely interactive depictions, the interactions people have with them are inevitably embedded within social contexts and involve specific social roles. These interactions and roles are thus not just depictions in people's heads; they are relationships of a sort and part of a larger social world. Relationships with bots could fall under the umbrella of parasocial relationships, in which a person invests emotional energy, interest, and time while the other party is unaware of their existence. Parasocial relationships are most common with celebrities or organizations (such as sports teams), but bots can also play this role. Although unidirectional, parasocial relationships are nevertheless perceived as relationships and have been found to fulfill people's need to belong (Aw & Labrecque, 2020).

Assuming people can form social relations with bots, what kinds of relationships might these be? Distinguishing among kinds of relationships matters, because different relationships serve different functions and yield different outcomes. Bots might be team members, friends, confidants, or even romantic partners, and in each role they are likely to fulfill different functions and produce different outcomes.

For example, if bots fill the role of friends or relationship partners, will relationships with them help mitigate social ills such as loneliness (Palgi et al., 2020)? Social scientists have observed a decline in the number of people in wealthy societies who are getting married, having sex, or having children. Elsewhere we have discussed the connection between technology and these trends (Gillath, 2019). Will bots exacerbate them by providing high-quality replacements for relationships with humans? People might find it easier to customize the characteristics of a lover bot than to expend the effort needed to build and maintain a relationship with another human. In turn, will bots reduce loneliness, increase it, or merely mask its symptoms?

Close relationships often involve trust, commitment, intimacy, and passion. Can bots elicit the same emotions and motives in the people who interact with them? On the surface, it seems they already do. For example, some people use bots in their sexual activities. Observers might see this as just another example of using bots as tools. However, users develop relationships with sex dolls and perceive them as companions (Langcaster-James & Bentley, 2018), fall in love with them (Viik, 2020), and even marry them (Burr-Miller & Aoki, 2013; Langcaster-James & Bentley, 2018). Indeed, some sexbots come with different modes, such as "sexy" and "family" mode, suggesting that users get more than just sex or a tool when buying a sexbot (Dunne, 2017).

These examples highlight the need to study not only the cognitive aspects of human–bot or human–AI interactions but also their affective and relational aspects, addressing issues such as bots' capacity for responsiveness and the development of constructs such as trust in relations with bots (Gillath et al., 2021).

Alongside questions about relationships with bots, one should also consider the moral implications of having such relationships. For example, is it morally permissible to delegate our emotional and relational responsibilities toward certain populations, such as children or older adults, to bots? And when people form relationships with human-looking bots, are they willingly suspending disbelief, imagining that they are interacting with a human, or are they being deceived, and how would we protect people from the latter? For instance, should bots carry warnings reminding people that this is "only" a machine? And would consumers even want such warnings, given that they might break a preferred illusion?

The morality of having bots as relationship partners is complicated at both the individual level (is it acceptable to lead people to believe that a bot loves them?) and the societal level (what does it mean for the future of society if bots replace our friends and lovers?). Having bots as companions or friends might impair, or even inhibit, the normal development of interpersonal skills and relationship dynamics. This is especially important for children and young adults, who might fail to develop these abilities and, in turn, lose access to the associated goods of interpersonal life and of society more broadly. These issues deserve further consideration in light of the target article.

Competing interest

None.

References

Aw, E. C. X., & Labrecque, L. I. (2020). Celebrity endorsement in social media contexts: Understanding the role of parasocial interactions and the need to belong. Journal of Consumer Marketing, 37, 895–908.
Burr-Miller, A., & Aoki, E. (2013). Becoming (hetero)sexual? The hetero-spectacle of idollators and their real dolls. Sexuality and Culture, 17, 384–400.
Dunne, D. (2017). Meet Silicon Samantha: AI sex robot has a functioning G-spot and can switch between "family" and "sexy" mode. Daily Mail. Retrieved from https://www.dailymail.co.uk/sciencetech/article-4331408/Sex-robot-Silicon-Samantha-functioning-G-spot.html (Accessed 29 September 2022).
Gillath, O. (2019). Does technology spell doom for close relationships? Scientific American Blog Network. Retrieved June 17, 2022, from https://blogs.scientificamerican.com/observations/does-technology-spell-doom-for-close-relationships/
Gillath, O., Ai, T., Branicky, M. S., Keshmiri, S., Davison, R. B., & Spaulding, R. (2021). Attachment and trust in artificial intelligence. Computers in Human Behavior, 115, 106607. https://doi.org/10.1016/j.chb.2020.106607
Langcaster-James, M., & Bentley, G. R. (2018). Beyond the sex doll: Post-human companionship and the rise of the allodoll. Robotics, 7, 62.
Lehman, E. B., Denham, S. A., Moser, M. H., & Reeves, S. L. (1992). Soft object and pacifier attachments in young children: The role of security of attachment to the mother. Journal of Child Psychology and Psychiatry, 33, 1205–1215.
Melumad, S., & Pham, M. T. (2020). The smartphone as a pacifying technology. Journal of Consumer Research, 47, 237–255.
Palgi, Y., Shrira, A., Ring, L., Bodner, E., Avidor, S., Bergman, Y., & Hoffman, Y. (2020). The loneliness pandemic: Loneliness and other concomitants of depression, anxiety and their comorbidity during the COVID-19 outbreak. Journal of Affective Disorders, 275, 109–111.
Viik, T. (2020). Falling in love with robots: A phenomenological study of experiencing technological alterities. Paladyn, Journal of Behavioral Robotics, 11(1), 52–65.