
31 Jul, 2025
Does AI introduce New Mental Health Challenges?

It started with a notification at 2:14 a.m. 

"Hey, I noticed your sleep pattern changed. Are you okay?"

Riya blinked at her phone - her mental health app was checking in on her. Nice, right? Except... she hadn’t shared that she was feeling down. 

The app had inferred it from her phone usage - late-night scrolls, fewer steps, shorter messages. It was strikingly insightful. And freaky.

AI and Mental Health: A Beautiful Partnership?

AI is making real inroads into mental health care: machine learning now powers therapy bots and assists with diagnoses, putting emotional support within reach of more people than ever before. In 2025, Dartmouth College studied 106 participants living with major depressive disorder, generalized anxiety disorder, or eating disorders.

The study tested a therapy bot called Therabot. Long story short: participants who used Therabot showed a 51% reduction in depression symptoms, a 31% reduction in anxiety symptoms, and a 19% reduction in eating-disorder concerns, an effect roughly comparable to outpatient care.

It gets better! Participants reported genuine emotional connections with Therabot: it was available 24/7, never judged, and nudged them with gentle reminders. Many experienced it as a sort-of-wise online friend, one that, notably, never wrote off traditional treatments like therapy.

According to Appinventiv, AI now powers wearables that help predict anxiety attacks, sleep monitors that trigger personalized recommendations, and CBT-based apps such as Wysa and Woebot. There is also EVA, built on IBM Watson, which offers decision support for clinicians and peer support for their clients.

That's the positive side. Here's the nerve-wracking part: while AI is helping many of us feel better, it might also be making some of us more anxious in the long run.

The New Face of Digital Stress

AI doesn’t just treat mental health; it shapes how we experience it. And not always for the better. Here are some of the newer, sneakier stressors AI brings along:

1. Algorithmic Anxiety

Riya’s experience isn’t uncommon; many users feel monitored even by products built to support them. When a chatbot or app seems to know an unsettling amount about you, and to have learned it fast, it can feel uncomfortable. Users may start second-guessing their actions, wondering what data is being collected, or, worse, wondering whether they are being silently judged.

While participants in the Dartmouth study did bond emotionally with Therabot, the illusion of empathy has a threshold. When responses feel too scripted or the bot sounds too robotic, users can come away feeling insecure, and sometimes lonelier than before.

2. Dependence on AI Tools

For a person in distress, one big benefit is that the AI listens immediately. But what happens when the Wi-Fi goes down? What if the AI misreads a genuine cry for help as ordinary distress? The Dartmouth researchers deliberately kept a human clinician in the loop alongside Therabot, because AI cannot yet detect risk the way a trained, licensed mental health professional can (Dartmouth News, 2025).

3. Data Privacy Issues

Mental health is personal. AI tools collect a lot of personal data, such as daily habits, sleep, and mood, and users often don’t know where that data goes or who can access it. For some, that uncertainty itself becomes a source of anxiety.

4. Digital Diagnosis Bias

AI learns from what it’s fed. If the training data is skewed, say, toward urban, English-speaking, white populations, the system may fail to understand other groups. According to a 2024 study on arXiv, these gaps can lead to harmful misdiagnoses and reinforce systemic inequities in mental health care.

A Personal Reflection: When the App Got It Wrong

Let's return to Riya for a second: a 25-year-old woman living alone in a new city, doing her best to balance work and a long-distance relationship while still exploring therapy options.

When she stumbled upon a highly rated mental health app, it felt like a miracle. The app responded quickly to her questions, offered CBT strategies, and even had a check-in reminder feature. One night, she wrote, "I feel like I don't matter."

The bot responded with, "Have you thought about going for a walk?"

Umm. Not quite emotional intelligence, after all.

The bot's response wasn't cruel; it was undertrained, inexperienced, you might say. Riya didn't want a productivity tip; she wanted a soft pause, a digital hug. She deleted the app and called a friend.

So, Where Do We Go from Here?

The role of AI in mental health is not all dark and stormy. If anything, it's evidence that technology can fill gaps in traditional systems, but we need to tread lightly.

Here are a few of the things we need to remember:

  • AI is a tool, not a therapist. It has the potential to supplement therapeutic practice, but not to replace it in its entirety.
  • Data privacy should be a priority and taken seriously. Transparency will be important here.
  • Cultural inclusivity is important. Algorithms need diverse representation in input if they are to provide equal services to everyone.
  • Regular audits are essential. Mental health care is evolving, which means our AI tools must evolve too.


A New Concern: AI-Generated Misinformation

According to a 2025 investigation by Reuters, generative AI chatbots can be manipulated into giving false health advice that is plausible, confident, and sometimes dangerously wrong. In a mental health context, that’s not just misleading. It’s potentially harmful.

The Human Touch Still Matters

We all want to be seen and heard, not just predicted.

AI is giving voice to those who would otherwise go unheard. It decreases stigma, enables support on a wider scale, and helps activate early interventions. But it cannot hold space for grief. It cannot share a laugh with you after a good cry. And it certainly cannot say, "I have been there too."

Riya's story is not a cautionary tale but a reminder: even in an age of AI and intelligent machines, the deepest healing will always come through connection with other humans, whether it's a friend's late-night call, the warm silence of a therapist's office, or simply saying, "I am not okay," while someone listens.
