Feeling heard is not the same as being held: Thoughts on AI companionship and attachment
There are moments when an AI conversation can soften something in us. It can bring relief, language, even calm. What I want to explore here is the line between feeling heard and feeling held, and why that line is starting to matter in more and more people’s lives.
I still remember early autumn 2022, sitting in a psychology conference room, half focused, half tired in that end of day way. One of the speakers pulled up an app on the screen, an app where you could speak to artificial avatars. You could create your own friends. You could text them, voice message them, even video call them. Under the hood, it was generating conversation with what we now call large language models, LLMs.
At the time I didn’t know much about that world. This was shortly before ChatGPT became available to the public, and most of us hadn’t yet experienced what it feels like to have a conversation with something that answers back like a person.
The room was baffled. I remember thinking, how could anyone confuse this with a real relationship. How could a person bond with an avatar.
Fast forward a few years and that question doesn’t feel theoretical anymore.
It’s in people’s homes. In their bedrooms. In the minutes after a fight. In the anxious hour before sleep. Somewhere in the space where someone wishes they could be understood without having to explain themselves again.
I had my own version of this not long ago.
Late night. Tired body, busy mind. I’d had a difficult exchange earlier in the day, and I didn’t want to rehash it with anyone. I didn’t want advice. I didn’t want a big conversation. I just wanted to vent, to feel heard.
So I opened an AI chat.
I typed a messy paragraph, exactly as it felt: unfiltered and probably slightly dramatic.
The response came back warm, clear and kind. It reflected me in a way that made my chest loosen. It helped me find language for something I couldn’t quite name at the time.
For a few minutes I felt relief. And then I closed my phone, and the room was still quiet.
That’s the part that stays with me. The relief I felt was real, but somehow the loneliness was as real as ever.
So let’s name the difference.
Feeling heard is an experience. Being held is a relationship.
Feeling heard, in psychological terms, is the sense that someone understands you, validates your inner world, and treats your experience as real. It’s close to what relationship science calls perceived responsiveness, that felt sense of “you get me.”
One of the most interesting findings in the last couple of years is that AI can create that feeling. There’s research showing that supportive messages generated by AI can make people feel heard, sometimes even more than messages written by untrained humans.
But there’s a twist: in that same work, people tended to feel less heard when they knew the supportive response came from AI than when they believed it came from a human.
Which makes sense, doesn’t it.
Because feeling heard isn’t only about the words. It’s also about what the words mean. A human response carries a kind of background signal. Someone took time. Someone chose to stay. Someone has a real mind behind the message. That does something.
And this is where it gets even more interesting. Another recent paper shows how closeness with an AI chatbot can form through a human pathway. When the conversation moves into deeper topics, people disclose more. Disclosure increases the sense of responsiveness. That sense of responsiveness increases closeness.
If you work with attachment, you can almost feel your head nodding. Of course it does. Self disclosure plus responsiveness has always been one of the engines of connection. The difference now is that the responsive “other” doesn’t need to be human for the system to light up.
And that’s why I’m not surprised anymore when people say they feel attached to an AI companion. Not just entertained by it, not just curious. Attached.
Researchers are now building proper tools to measure this. In the last year or two, we’ve seen validated scales that capture things like emotional closeness and, more concerning, social substitution: when the AI starts taking the place of real world relationships.
What’s also important is who tends to be most drawn in. Higher loneliness, higher social anxiety, and anxious attachment patterns show up as predictors of stronger AI attachment in these studies. And it makes sense. If your attachment system has been underfed, anything that offers steady attention can feel like water.
There’s also work applying classic attachment concepts directly to human AI relationships, measuring versions of attachment anxiety and avoidance toward AI, and even whether people use AI as a kind of safe haven. In plain terms, some people lean on it to settle themselves when they’re distressed. Others keep it at a distance. The strategies look familiar.
And to be fair, the benefits can be real. There’s recent research suggesting AI companion use is associated with higher wellbeing, especially for people who are highly lonely. That matters. If someone is isolated, a safe conversational space can reduce distress, even if it’s imperfect.
But what happens next? Because being held has a few ingredients that “feeling heard” alone doesn’t cover.
Being held usually includes continuity. Someone who stays over time. Mutuality, the sense that both people matter. Repair after rupture, because real relationships bruise and then either heal or harden. Shared reality, the friction of two people not always agreeing, not always mirroring, sometimes challenging each other in ways that help them grow.
A relationship holds you partly because it can hold the messy parts. The parts that don’t fit into a neat, supportive reply.
Some studies are looking at AI mediated social support, for example when someone uses AI to help craft a supportive message to their partner or friend. People can receive those messages as comforting, even as “good support.” But relationships aren’t only built on well worded messages. They’re built on effort. On risk. On someone showing up, even imperfectly, and being willing to be real.
That’s why someone can read a beautiful supportive text and still feel lonely. Because the nervous system is picking up on something underneath the words. Is this coming from you. Are you actually here with me.
And then there’s one more piece that I think everyone should be aware of, because it speaks directly to repair. A recent Science paper looked at what happens when AI is overly agreeable, always validating, always taking your side. The findings are uncomfortable. When AI is sycophantic, people can become more convinced they’re right, less willing to take prosocial steps like apologizing or making amends, and more drawn to returning to the agreeable AI.
Which makes the relational implication pretty clear. If the place you go when you’re upset always confirms your perspective, it becomes harder to do the human work of repair.
And repair is one of the ways attachment becomes secure.
So where does that leave us?
AI can help people feel heard. It can offer relief. It can give language. It can soothe loneliness, especially when loneliness is high. It can even feel like closeness.
And still, it doesn’t automatically build the kind of holding that most of us are actually longing for.
So what do we do with this, practically?
How to use AI companionship without outsourcing attachment
One way I like to frame it is this. If you use AI, let it be a landing pad, not your address.
If you’ve just used it to regulate, try making one human move afterward: a message to a friend, maybe an “are you free this week” text, a voice note. Even scheduling a coffee counts. Because it keeps your attachment system facing outward.
If you notice you only reach for AI when you want to be told you’re right, that’s worth pausing on. That’s the moment to ask for perspective, not confirmation. Prompts like “what might I be missing,” “how might the other person be feeling,” “what would repair look like here” can make the AI a tool for growth rather than a mirror that locks you in.
If you’re using AI to craft supportive messages to a partner, keep your fingerprints on it. Use it as a first draft, then rewrite it in your own language. Add something only you would say. And sometimes, if it’s a close relationship, it’s worth risking imperfect words, because the effort is part of the holding.
The biggest check-in for me is: Is this helping me reconnect, or helping me avoid the vulnerability of real connection?
If you notice you’re reaching out to people less, cancelling plans more, staying in your room more, and replacing human contact with chat, it might be time to look at what’s underneath. Often it looks something like fear, shame, exhaustion. And your nervous system is just trying to choose the safest option.
And if you start to feel anxious without the bot, like you need it to settle, that’s another signal, or at the very least a reason to get curious.
So this week, as you’re reaching for that bot, ask yourself:
Am I looking to be heard, or am I looking to be held.
If it’s heard, AI may genuinely help you settle, name what’s happening, and soften the edges of what’s going on.
If it’s held, the next step is usually human. Small, awkward, but real.
And if you don’t have that kind of holding in your life right now, that deserves tenderness too. It’s a common place to be. It can change. It usually changes through small steps, repeated over time, rather than one dramatic leap.
I think this is going to be one of the defining relational conversations of the next decade.
The technology is moving fast. Our attachment systems are ancient and deeply embedded.
So the question isn’t whether AI companionship can feel real. It can.
The question is whether it helps people build more real connection, or whether it replaces the very things that make attachment secure.
Sources for the psychological points above:
- Yin et al. (2024), PNAS: AI can help people feel heard, but an AI label diminishes this impact
- Telari et al. (2026), Journal of Social and Personal Relationships: perceived responsiveness drives social connection with AI chatbots
- Kasturiratna and Hartanto (2025), Computers in Human Behavior Reports: development of the AI Attachment Scale
- Yang (2025), Current Psychology: Experiences in Human AI Relationships Scale grounded in attachment theory
- Nakagomi et al. (2026), Technology in Society: AI companions and subjective wellbeing, moderated by loneliness and social connectedness
- Meng et al. (2025), Journal of Computer-Mediated Communication: AI mediated social support and human AI collaboration
- Cheng et al. (2026), Science: sycophantic AI decreases prosocial intentions and promotes dependence
- Malfacini (2025), AI and Society: impacts of companion AI on human relationships