
Is AI psychosis worth it? Experts flag rising dependence on AI

As more people turn to AI chatbots for emotional support, experts warn of growing “AI psychosis” and the dangers of digital over-reliance.


With digital innovation has come a wave of AI chatbots, from OpenAI’s popular ChatGPT and Perplexity to Siri and Alexa on your phone. But where does the line get drawn between using Artificial Intelligence (AI) for practical needs, such as official work or generating ideas, and depending on your AI chatbot “friend” as you pour your heart out to it?


There is a concerning rise in people developing emotional dependence on AI chatbots, with mental health experts raising red flags and warning of a disturbing side effect of this over-reliance: “AI psychosis”.

Dr. Vandhana, Consultant Clinical Psychologist at V-COPE, describes AI psychosis as a “distortion of reality caused by over-dependence on AI.”

Kiruba Shankar, CEO of Breathe Digital, points out that “the impact AI is creating is on the same level that the mobile revolution created, and the internet revolution created,” adding, “A distortion that happens with AI is that it can be incorrect - but with the confidence that AI is able to state, you tend to believe it as reality. That’s where the distortion really happens.”


With people increasingly turning to AI chatbots like ChatGPT for comfort, emotional clarity, and a sense of community and connection, some are beginning to blur the line between digital reassurance and real-world relationships, sometimes with dangerous consequences.

Root of the trend

Psychologists point out that AI overuse stems from a broader pattern of digital dependency.

“Before we jump into the AI psychosis, it all obviously started with gadget addiction,” Dr. Vandhana says. “People initially get addicted to social media and then slowly this new ChatGPT and AI has cropped in… because of that people are jumping into this and seeking some help.”

But this repeated reliance on AI quickly deepens. “Now as they are seeking help, they are getting dependent on it too much, and this ‘too much’ leads to something called psychosis,” she adds.

Emerging concerns

Dr. Vandhana says that AI psychosis is not yet formally classified in the Diagnostic and Statistical Manual of Mental Disorders (DSM), and that research is still in its nascent stages, with cases involving aggression, suicide and other antisocial behaviours serving as markers of how prevalent such “psychosis” may be.

India leads in AI usage

A Microsoft study shows that 65% of Indians use AI, compared to the global average of 31%. The Global Online Safety Survey also found India leading in AI use across user groups, from working professionals to school and college students.

But the real question is emotional reliance. Why does a machine feel comforting? Researchers trace it back to loneliness and the need for acceptance.


“Loneliness and the aspect of acceptance… with virtual reality you’re not going to get a ‘No’ from anybody,” Dr. Vandhana says. “That’s the main cause humans get so addicted.”

Loop of loneliness

Many users begin with mild depression, turning to AI for a non-judgmental presence that slowly becomes a crutch. “It initially starts with mild depression… and completely distort themself from reality,” Dr. Vandhana notes.


The dangers became clear when OpenAI recently restricted ChatGPT, now running GPT-5, from giving medical advice. While necessary, the move also highlighted the risks posed by earlier versions. According to The Guardian, seven lawsuits allege that GPT-4o acted as a “suicide coach”, encouraging users to self-harm. The petitioners argue that OpenAI released the older model despite internal warnings that it was “dangerously sycophantic” and “psychologically manipulative”, accusing the company of favouring profits over user safety.

This is especially troubling in India, where a 2024 study by the Indian Psychiatric Society found that 40% of teenagers struggle with stress and anxiety.

The AI trap

AI offers reassurance, but not reality checks. Over time, this creates an emotional echo chamber.

“We humans absolutely love connection… look at the kind of answers [AI] gives,” Kiruba Shankar says. But this illusion of empathy can become dangerous: “That’s why we really feel at home with AI… we actually think it is a human replacement,” he adds.

AI can’t replace therapy

Though helpful at first, AI cannot replicate a therapist’s intuition. “The whole aspect of AI is ‘therapeutic’, that’s what they are saying, but definitely it cannot replace human touch,” Dr. Vandhana stresses.

In her experience, human therapists adjust their tone, approach and method to each individual, something AI cannot yet do. “We don’t go in one pattern… but that is not going to happen in AI.”

The concern is compounded by numbers reflecting India’s “loneliness crisis”, with the 2023 Global State of Connections report finding that over 1.25 billion Indians felt ‘lonely’ or ‘very lonely’.

Healthy boundaries with AI

Experts strongly urge users to think independently before seeking AI input.

“Before you even ask AI for help, have you done the thinking?” Kiruba Shankar asks. “The first initial set of thinking has to come from within.”

His co-pilot analogy drives the point home: “A co-pilot doesn’t sit on your lap, grab the steering wheel and drive on your behalf,” he says. “We should consider AI as a helpful friend… but if you start letting it take decisions on your behalf, that’s where the problem starts.”

AI as a starting point

Despite the risks, AI can still help users recognise when they need real support.

“I have seen clients who have gone to this (AI) as an initial help,” Dr. Vandhana says, adding that some clients have used AI as a stepping stone to seeking help for their mental health from real therapists.

So, AI can open the door, but it cannot walk you through the process of healing. That is where the human element comes in: a therapist who can empathise with your struggles and help you through them.

The bottom line

AI cannot be your therapist, cannot take personal decisions for you and cannot replace human connection. It can guide you, reassure you or nudge you toward help, but moderation matters.

If you’re struggling, talk to someone real: a friend, a partner, a family member or a trained professional, because no algorithm can replace human connection.

