
AI Is A Tool, Not Your Friend Or Therapist

A caveat is that AI tools by themselves may not cause someone with no issues to harm themselves, but when there are pre-existing issues, depending on AI tools for help will make things worse.


Would you chat with an AI chatbot as if the app were a close friend? I certainly won’t, despite my love for all things tech and being an early adopter of most tech trends. However, even if most people are like me when it comes to AI chatbots and can’t imagine getting emotionally attached to an AI tool, evidence is mounting that some young people today use AI chatbots as friends and, what’s worse, some actually use these bots for therapy.

A recent LinkedIn post by a market researcher described a Gen Z group discussion that made clear to her how prevalent this trend was, an impression a therapist later confirmed. The therapist told her that users turn to AI when they need to vent because AI bots don’t interrupt, don’t judge, don’t challenge and don’t expect a follow-up. The therapist described the practice as "emotional outsourcing. Short-term relief, long-term detachment".

I felt this warranted some digging. Were people actually using AI bots for therapy despite the many warnings these tools display? I spoke to two experienced psychiatrists about whether they were coming across patients who used AI chatbots for therapy.

Atul Aswani, a psychiatrist who practices in Mumbai, said around 10% of his patients are using AI chatbots for therapy, or to check medication and cross-question treatment regimens, and the number is growing. All of these patients are young and tech-savvy. “Some have told me they are asking ChatGPT the same questions they are asking me and want to see if I can provide a better answer,” he explained.

Besides AI chatbots simulating a feeling of connection, he believes another reason some people seek help from AI when they should be seeking professional help is the cost and effort of setting up a medical consultation. And, of course, the fact that there is not even a whiff of judgement from AI tools. While these tools carry disclaimers about not being a replacement for professionals, Aswani said these were like the warnings about smoking being injurious to health: smokers hooked on nicotine light up anyway.

Aswani is trying out various AI tools himself to understand them better and help his patients, rather than demanding they stay away from such tools. However, he is clear that the best AI tool is inferior to the most mediocre therapist when it comes to therapy.

Jamila Koshy, a senior psychiatrist from Bengaluru, told me that another major issue with using AI tools for therapy is that some cases call for medication, and anyone relying on these tools may miss out on treatment that is urgently needed.

Clearly, this is an area of concern. It is one thing to use apps as an adjunct to mental health treatment, for things like reminders to exercise or journal, but an altogether dangerous proposition to use AI tools for emotional outsourcing or, worse, actual therapy. An important caveat to note is that AI tools by themselves may not cause someone with no issues to harm themselves, but when there are pre-existing issues, depending on AI tools for help will make things worse.

Tech companies developing these tools will certainly display warnings and disclaimers, but will self-regulation be enough? Or do medical bodies and regulators need to step in to ensure there are effective guardrails? Even if someone unwisely decides to use a chatbot for therapy, any signs of suicidal tendencies and the like should result in the tool doing more than ignoring them or merely asking the person to seek professional help; it could perhaps send a warning to someone who could intervene.

This is a grey area and not easy to execute, given that privacy is a right too, but it is important to build mechanisms that ensure care for a person who may be at risk. We can’t stop people from using tech for purposes it was not intended for, but when trillions are expected to be spent on developing AI tools, surely there can be a sharper focus on guardrails too.

We have a story this week on the US administration taking this matter seriously. Sam Altman of OpenAI has also said that while people have used AI in self-destructive ways, if someone is in a mentally fragile state and prone to delusion, OpenAI does not want AI tools like ChatGPT to reinforce that.

White House AI Chief Is Worried AI Could Impact Your Mental Health

Here are some of the other AI-related stories we covered in the past few days. Check the list to see if you missed any:

  • ChatGPT Go Vs ChatGPT Plus Vs ChatGPT Free: A Comparison

  • Decoding AI Trends And Regulatory Grid

  • India's Ability To Prepare For Tech-Driven Wars To Decide Country's Future: Gautam Adani At IIT Kharagpur

  • 'AI Will Make Us More Human' — Arundhati Bhattacharya Weighs In On Impact Of AI On Jobs, Livelihoods

  • China's DeepSeek Releases V3.1, Boosting AI Model's Capabilities

  • IgniteTech CEO Eric Vaughan Says, 'Was Necessary, Would Do It Again' After Firing 80% Of Employees

  • Ex-Twitter CEO Parag Agrawal Is Creating ‘Internet For AI’: All You Need To Know About His Startup Parallel

  • Anthropic Nears Funding At $170 Billion Valuation

  • Enterprises See AI's Main Value In Better Decision-Making Rather Than As An Automation Tool: Survey

  • OpenAI’s Sam Altman Expects To Spend ‘Trillions’ On Infrastructure

  • From Chatbots To Taskbots: What is Agentic AI And How Can It Help Businesses?

  • Trader Uses Perplexity Chatbot To Create F&O Strategy. Here's What Happened

  • Investing In AI Firms To Build Massive Wealth Or Wipe Out Most Capital? Indian Finfluencer Explains

  • 'AI-Generated Content Is Fine As Long As...,' Google Clarifies Stance

  • ChatGPT And Grok Comparison: Which Is Better? 

Till next week,

Ivor Soans. 

Disclaimer: The views expressed here are those of the author and do not necessarily represent the views of NDTV Profit or its editorial team.
