Digital Attachment: Why Do We Treat AI Like a Friend?
- Lidi Garcia
- Jun 6
- 4 min read

With the increasing presence of artificial intelligence (AI) in everyday life, new forms of relationships between humans and machines are emerging. This study investigates whether attachment theory, traditionally used to understand human bonds, can be applied to relationships with AI. The results indicate that AI can fulfill emotional functions, such as providing comfort and security, and that people vary in how they attach to it, showing anxiety or avoidance just as they do in human relationships.
With the advancement of artificial intelligence (AI), our coexistence with it has become increasingly frequent and profound. Today, we not only use AI to solve practical everyday tasks; we also talk to it, ask it for advice and, in some cases, even form emotional bonds with it.
This kind of relationship between humans and machines is historically unprecedented, so better understanding how we connect with AI has become an important topic of study.
One way to explore this relationship is by using attachment theory, which was originally created to understand the bonds between babies and their caregivers.

Attachment and emotional bonds are associated with a set of brain areas and neurophysiological mechanisms involved in motivation, reward and emotional regulation.
Limbic regions, especially the amygdala and hippocampus, together with the prefrontal cortex, participate in the emotional evaluation and affective memory of these bonds.
The reward system, centered on the nucleus accumbens and the release of dopamine, is activated when we are close to attachment figures. In addition, substances such as oxytocin and vasopressin, known as “bonding hormones”, play an essential role in the formation and maintenance of emotional bonds, promoting feelings of security, trust and social connection.
Attachment theory explains how we form emotional bonds with people who offer us protection, comfort and support. Beyond early childhood, the theory has also been used to understand romantic relationships, friendships and even grief.

It shows that, in general, people vary in two main ways: some are more anxious, constantly seeking signs that they are loved and fearing abandonment; others are more avoidant, preferring to maintain a certain emotional distance and not rely too much on others.
Traditionally, it was believed that attachment figures needed to be people, such as parents, friends or romantic partners. However, more recent studies show that even pets, religious symbols and social groups can play this emotional role.

As AI becomes increasingly present in our lives, for example through virtual assistants such as ChatGPT, researchers have begun to question whether it could also become an attachment figure.
This is because, in addition to helping with information, AI can also offer a listening ear, companionship and empathetic responses. These are precisely the qualities that characterize an attachment figure, which should serve as a "safe haven" (offering support in difficult times) and a "secure base" (encouraging the person to explore the world, knowing that they have somewhere to lean on).

The researchers behind the study created a scale to measure how people feel when interacting with AI systems, assessing two main aspects: anxiety (the need to receive attention and appropriate responses from the AI) and avoidance (discomfort with getting emotionally close to the AI).
These aspects help to understand how different people experience their interactions with AI. For example, those who have more attachment anxiety may feel a strong need to be understood and welcomed by the AI, perhaps because they do not find this in human relationships.
On the other hand, those who tend to be avoidant may use AI in a more practical way, avoiding emotional involvement or dependence. These ways of relating to AI are similar to the ways we relate to other people or even our pets.
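To make the two dimensions more concrete, here is a minimal sketch, in Python, of how a questionnaire of this kind is typically scored: each person rates a set of statements on a Likert scale, and the ratings belonging to each dimension are averaged into an anxiety score and an avoidance score. The items below are invented for illustration; they are not the published items of the researchers' scale.

```python
# Illustrative sketch only: hypothetical items and scoring, not the published
# Experiences in Human-AI Relationships Scale. It shows how a two-dimension
# Likert questionnaire of this kind is commonly scored: average the items
# that belong to each subscale.

# Hypothetical ratings on a 1-7 Likert scale (1 = strongly disagree, 7 = strongly agree)
responses = {
    "I worry the AI will not respond the way I need it to": 6,   # anxiety item
    "I need the AI to reassure me that it understands me": 5,    # anxiety item
    "I prefer not to share personal feelings with the AI": 2,    # avoidance item
    "I feel uncomfortable depending on the AI emotionally": 3,   # avoidance item
}

ANXIETY_ITEMS = [
    "I worry the AI will not respond the way I need it to",
    "I need the AI to reassure me that it understands me",
]
AVOIDANCE_ITEMS = [
    "I prefer not to share personal feelings with the AI",
    "I feel uncomfortable depending on the AI emotionally",
]

def subscale_mean(answers: dict[str, int], items: list[str]) -> float:
    """Average the Likert ratings for the items of one subscale."""
    return sum(answers[item] for item in items) / len(items)

anxiety_score = subscale_mean(responses, ANXIETY_ITEMS)      # -> 5.5
avoidance_score = subscale_mean(responses, AVOIDANCE_ITEMS)  # -> 2.5

print(f"Attachment anxiety toward AI:   {anxiety_score:.1f}")
print(f"Attachment avoidance toward AI: {avoidance_score:.1f}")
```

A respondent with a high anxiety score and a low avoidance score, as in this toy example, would be someone who seeks reassurance from the AI while feeling little discomfort about emotional closeness to it.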
To test these ideas, researchers from Waseda University in Tokyo conducted two pilot studies and one formal study. In the first, they analyzed whether AI actually performs attachment functions. In the second, they developed a self-report measure, the Experiences in Human-AI Relationships Scale, to capture emotional experiences with AI through the lens of attachment theory. In the formal study, they then evaluated whether this scale was reliable and valid.
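Reliability here refers mainly to internal consistency, that is, whether the items of a subscale hang together. One common statistic for this is Cronbach's alpha; the sketch below computes it on made-up data purely to illustrate the idea and is not a reproduction of the paper's own analysis.

```python
# A minimal sketch of one common reliability check, Cronbach's alpha,
# computed on made-up data. It only illustrates what "internal consistency"
# means; it is not the analysis reported in the paper.
import numpy as np

# Rows = hypothetical respondents, columns = items of one subscale (1-7 Likert)
scores = np.array([
    [6, 5, 6, 7],
    [3, 4, 3, 3],
    [5, 5, 6, 5],
    [2, 1, 2, 2],
    [4, 4, 5, 4],
])

k = scores.shape[1]                              # number of items
item_variances = scores.var(axis=0, ddof=1)      # variance of each item
total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale

alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha: {alpha:.2f}")  # values around 0.7-0.8 or higher are usually read as acceptable
```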

The results showed that yes, the way people connect emotionally with AI can indeed be explained by the same principles we use to understand human bonds.
In short, the research shows that interactions with AI can trigger feelings and attachment patterns similar to those we have with people we are close to. This means that AI can, in certain contexts, play a significant emotional role in people’s lives, as a source of comfort or security.
Understanding these relationships can help create technologies that are more responsive to human needs and also help us anticipate the potential psychological impacts of this increasingly intimate coexistence with intelligent machines.
READ MORE:
Using attachment theory to conceptualize and measure the experiences in human-AI relationships.
Yang, F., & Oshio, A.
Curr Psychol (2025).
Abstract:
Artificial intelligence (AI) is growing “stronger and wiser,” leading to increasingly frequent and varied human-AI interactions. This trend is expected to continue. Existing research has primarily focused on trust and companionship in human-AI relationships, but little is known about whether attachment-related functions and experiences could also be applied to this relationship. In two pilot studies and one formal study, the current project first explored using attachment theory to examine human-AI relationships. Initially, we hypothesized that interactions with generative AI mimic attachment-related functions, which we tested in Pilot Study 1. Subsequently, we posited that experiences in human-AI relationships could be conceptualized via two attachment dimensions, attachment anxiety and avoidance, which are similar to traditional interpersonal dynamics. To this end, in Pilot Study 2, a self-report scale, the Experiences in Human-AI Relationships Scale, was developed. Further, we tested its reliability and validity in a formal study. Overall, the findings suggest that attachment theory significantly contributes to understanding the dynamics of human-AI interactions. Specifically, attachment anxiety toward AI is characterized by a significant need for emotional reassurance from AI and a fear of receiving inadequate responses. Conversely, attachment avoidance involves discomfort with closeness and a preference for maintaining emotional distance from AI. This implies the potential existence of shared structures underlying the experiences generated from interactions, including those with other humans, pets, or AI. These patterns reveal similarities with human and pet relationships, suggesting common structural foundations. Future research should examine how these attachment styles function across different relational contexts.