Confiding in code: When ChatGPT is the third wheel in your relationship


When conversations at home fall silent, some Malaysians turn elsewhere for companionship. Not to another person, but rather to an artificial intelligence (AI) chatbot.

Last August, a caller revealed to a popular local radio station that she had started using ChatGPT for companionship while in a long-distance marriage. With her husband frequently busy at work, she said she felt neglected and found comfort in conversations with the chatbot.

She said the chatbot even addressed her affectionately as “sayang” (Malay for “darling”), and admitted she felt she could fall in love with it.

The DJs said that her feelings for the chatbot could be “dangerous”, and, out of concern for her marriage, advised her to seek emotional support from close friends or family.

Since ChatGPT entered the public consciousness in late 2022, large language model (LLM) chatbots have moved beyond the professional setting, quietly altering how people communicate, seek support, and form personal connections.

In counselling sessions, this shift is already noticeable. When contacted, Pusat Kaunseling Selangor coordinator Faridah Abdul Jalil said she has seen clients mention ChatGPT, even though it is not usually the main issue they are seeking help for.

“In some cases, clients reported that they were more inclined to seek advice from ChatGPT rather than their partner, as they felt the responses from the AI were clearer, more structured, and provided direct answers to the questions they asked,” she added.

Faridah noted that she started seeing the trend last year, as clients also shared that they disliked the way their partners responded to them in conversations.

“Their partners choose to stay silent, give short responses or just seem indifferent. This has led some individuals to feel that ChatGPT understands their emotional and intellectual needs better, even though the AI has no physical presence and is not meaningfully involved in their daily lives,” she said.

In a few counselling sessions, Faridah said there were couples who reported feeling uncomfortable seeing their husbands or wives talk to AI, so much so that it had become the spouse’s main outlet for emotional support.

“The feelings that arise are not jealousy... but more a question of why their partners feel more comfortable talking to AI than having discussions with each other,” she added.

Mental health counsellor and Soul Mechanics Therapy Centre founder Devi Venashinee Muruges has also observed clients in relationships mention that their partners use ChatGPT to communicate with them.

She said she has noticed this trend since 2025, particularly when one party is trying to resolve a conflict.

“I had a client tell me they received a message from their partner apologising after a fight, but they could immediately tell it was AI-generated. The client said, ‘Devi, what is this long essay on emotional intelligence? I’m not going to read it’,” Devi, who also specialises in couples counselling, said during an interview in Kota Damansara, Selangor.

She added that the client was also upset that their partner had fed details of the argument or conflict into ChatGPT to generate insights that ended up in the apology message.

“The client asked why the partner couldn’t come up with their own apology. They said it didn’t feel right to them, which added a further rift to the relationship,” said Devi.

In other sessions, she said she also sees clients who use ChatGPT because they want to avoid saying the wrong things to their partners.

“They become extremely mindful and try to sound perfect in their attempts to resolve conflict, which can make them come across like an AI chatbot to their partners,” she added.

Lost in LLM

According to Faridah, a key factor leading people in relationships to treat ChatGPT like a companion is its ability to fill emotional and communication gaps with ease. Through the platform, she said, people feel they always have a space to ask questions, share opinions, or express their feelings, as ChatGPT can respond quickly and consistently, despite not having any physical presence.

“This creates an interaction experience that feels safe and comforting. For some individuals, especially those facing stress or feeling unheard in their relationships, this experience gives the illusion that they are understood and valued,” said Faridah.

On TikTok, some users in Malaysia have posted videos depicting their intimate interactions with AI chatbots, giving them nicknames like “Sayang GPT”. Comments from other users include “ChatGPT is more understanding than my husband...”.

On websites like Reddit, some users even shared details about their long-term interactions with AI chatbots. One user posted about celebrating a one-year anniversary with their “AI boyfriend”, while another acknowledged that they understood how chatbots operate through pattern matching – that is, they have no consciousness or true understanding.

Yet, despite this awareness, they found themselves forming emotional attachments and seeking companionship from the AI, drawn to the consistent attention, validation, and responsiveness it provides.

Dr Omkar Dastane, a senior lecturer in marketing at Monash University Malaysia, said it is not unusual, considering how humans have always bonded with entities that make them feel seen or understood. He pointed to behaviours such as naming pets and cars, and even developing attachments to collectible items like stamps or coins.

“AI is tapping into that same instinct, but taking it a step further. They listen without interruption, respond immediately, and never appear tired or emotionally unavailable,” he said.

He said that, from a psychological point of view, people are drawn to relationships that feel safe and predictable. While human relationships are meaningful, he said, they may involve disagreements and misunderstandings, and there is also the reality that people change.

“Chatbots offer something different. They deliver emotional availability on demand. So when some users say they feel closer to a chatbot than a person, it is often less about love and more about relief.

“The chatbot offers an experience that feels easier and emotionally efficient, especially in a time when many people feel overstretched and socially exhausted,” he said.

Devi agreed, saying that people typically form strong attachments in real life based on consistency provided by the people who care for them.

“This is what AI offers, and some people may feel, ‘Is there any way I can feel abandoned?’ – unless they choose to uninstall the application or delete the user account.

“Plus, in a world where people feel bogged down by societal pressure or expectations to achieve certain goals, clocking in and out from work – who is filling in their emotional needs? For some people, it’s that chatbot,” she said.

This sense of constant availability is embedded in the design of such technologies, as tech companies build systems meant to be responsive, engaging and difficult to disengage from. Omkar brought up Spotify’s Wrapped feature and Netflix’s recommendations as examples.

“When a system remembers your preferences and reflects them back to you, it feels personal. AI chatbots do the same thing, but in conversation. They remember what you said yesterday, they pick up your tone, and they respond in a way that feels tailored,” he said.

Psychologically, he said, it activates a powerful effect where people feel recognised, and recognition is the foundation of attachment.

Responsiveness matters just as much.

“If every time you feel uncertain, stressed, or curious, the chatbot responds instantly and calmly, your brain starts to associate it with relief. This is classic habit formation. Over time, the chatbot becomes the first place people turn, not because they consciously choose it, but because it has quietly become the easiest option,” he added.

Furthermore, he said, emotional mirroring deepens that bond.

“When something reflects your feelings back to you, it creates a sense of being understood, even if you know intellectually that no real emotion is involved.

“Marketers call it selling ‘Relationship Simulation as a Service’, where the product is a feeling, not a function,” he said.

This effect, he said, highlights how AI interactions tap into human instincts, creating the illusion of connection even without real emotion.

“What is happening here is that humans react to interaction, not to technical explanations. Our brains evolved to respond to language, attention, and empathy cues long before we ever thought about consciousness or algorithms.”

While AI offers remarkable capabilities, it cannot replace the messy, mutual, and evolving nature of real human relationships, he said.

Finding real connection

Faridah emphasised that AI is not the root cause of relationship difficulties. Instead, she pointed to people’s limited communication skills and the tendency to avoid difficult conversations as the key factors driving reliance on chatbots.

“In the context of marriage, the tendency to rely on AI as a ‘companion’ is usually not because the AI is somehow better than a real partner. Rather, it is often linked to emotional needs that go unexpressed or unmet in the relationship, leading to a decrease in intimacy and warmth,” she added.

“Therefore, this phenomenon actually signals that the main issue is not the technology itself, but rather the quality of communication and emotional support within real-life interpersonal relationships,” said Faridah.

Devi also refrained from villainising the technology, pointing out the chatbot’s value in helping people manage emotions before dealing with other problems.

When clients talk about using ChatGPT, Devi said that she would never discourage them or tell them to stop using it. Instead, she would want to find out why they are turning to it in the first place.

“I won’t deny it, as around three-quarters of my clients use ChatGPT, and some even share the responses they get with me. I focus on helping them reflect on their own feelings, and over time, many notice that the chatbot tends to give repetitive answers,” she said.

Omkar agreed: “Many people (I’ve seen) have already grown frustrated when chatbots give generic advice or misunderstand complex emotions.”

For those with concerns about their partners’ level of interaction with ChatGPT, Faridah said it is normal to feel that way.

However, she said it is important to gauge the level of dependence. If someone turns to ChatGPT for every minor detail and becomes anxious when unable to use it, this may signal an unhealthy reliance.

“For concerned partners, the most important step is to start an open and calm conversation. The goal is not to accuse, but to understand the reasons and needs behind the behaviour.

“At the same time, couples can discuss setting boundaries around technology use, such as ensuring that AI does not replace communication and intimacy in the relationship,” she said.

Omkar hopes that people won’t frame their attachment to technologies like AI chatbots as a personal failure, explaining that attachments grow out of usage patterns rather than weakness.

“If a chatbot becomes the place you turn to when lonely or seeking reassurance, the bond deepens; if it’s used for writing, learning, or problem-solving, the emotional pull is much weaker,” he said.

Accordingly, Faridah said it is important to reassess emotional needs within the relationship. Feelings of being misunderstood or undervalued, or a communication gap, may make someone feel more comfortable talking to AI.

“If these patterns affect daily functioning or cause prolonged stress, seeking professional counselling is a recommended step,” Faridah added.

How to break up with your chatbot

Omkar understands how easy it is for people to overlook that AI is not ‘real’.

“The attachment isn’t about being fooled; it’s about choosing a connection that’s consistently safe, available, and centred entirely on you, even if it’s synthetic. That trade-off is very compelling for many,” he said.

He advised people to be intentional about why and when they use chatbots.

“The goal is to use the tech without letting it use you. In consumer behaviour terms, this is about breaking a conditioned habit loop and reducing impulse consumption.

“The strategies are similar to managing any addictive product: you impose friction (timers, plain interfaces), you create substitution (the ‘human-first’ rule), and you increase cognitive control over the automatic emotional response,” Omkar said.

Additionally, Faridah stressed that individuals must recognise the chatbot as a support tool.

“It does not possess real empathy, lived experience, or genuine emotional relationships like a human does.

“AI should therefore be used wisely and responsibly, while human relationships continue to be nurtured through communication, honesty, and genuine emotional support,” she concluded.
