
How ChatGPT is creating emotional and mental dependency

10/2/25, 6:00 AM

Over the past several years, the way humans interact with technology has undergone a dramatic shift. ChatGPT is no longer just an informational tool; it has become a conversation partner. I have been reflecting on my own usage. As a student, late at night, under pressure and anxious about an exam, I do not call a friend; I go to ChatGPT for help. I will even admit that I have started talking to it as if it were a person, and I believe many people are like me. Each time, I get a calm, reasonable, instantly validating response. It is like having a therapist who never sleeps, never gets annoyed, and never tells me I am overreacting. ChatGPT began as a problem-solving tool: it answers questions, explains concepts, and helps users learn. But over time, many people have come to use it in more personal ways. From asking about a career to questioning a relationship, users turn to AI not simply to 'know' something, but for comfort in troubling times. That shift shows how easy it has become to reach for this kind of comfort.

Why does this dependency form? The first reason is availability. ChatGPT never sleeps, never complains, and never refuses to listen; for someone who feels isolated, or has isolated themself from other people, it can look like a lifeline. The second reason is non-judgment. Human conversation carries the risk of conflict, disappointment, or rejection, while a response from ChatGPT is even-handed and non-judging, which seems to offer a safe emotional space. Another aspect is loneliness. Many people today feel socially disconnected, and for some of them ChatGPT fills that gap with endless conversation: it is available at home, at any hour, and it never tires. That comfort is real, but it is thin and shallow, lacking the depth of genuine human connection. Someone who spends long stretches talking to AI invests less energy in human relationships, and the longer this goes on, the harder it becomes to confide in another person. Psychologists warn that over-reliance may weaken our ability to cope: interactions with people, imperfect as they are, build the social resilience we need to navigate our relationships with others and with ourselves. Using ChatGPT to avoid real-life interaction means escaping the very discomfort that drives growth, which is especially problematic for young people still developing their emotional fortitude.

Breaking the habit comes down to creating boundaries. ChatGPT is meant to serve as a helpful tool, not as a replacement for human empathy or judgment. Proper use means understanding its limitations: it can simulate compassion, but it cannot actually feel it. Used responsibly, ChatGPT can be empowering, but the risk is in relying on it to be one's "voice" instead of cultivating one's own inner strength.

In conclusion, people develop a dependency on ChatGPT because it is always available, never judges, and fills the gap left by loneliness; keeping it a tool rather than a crutch requires deliberate boundaries and real human connection.
