Artificial Intelligence and Mental Health Therapy

An estimated 52% of adults in the United States use artificial intelligence chatbots such as ChatGPT, Claude, or Gemini, and usage continues to grow. Reasons people turn to chatbots for health concerns may include cost and information barriers to health care, reduced stigma and fear of judgment, and distrust of the healthcare system.

While chatbot tools can provide quick information about health issues, they are not a replacement for a therapist or medical professional. In October 2025, OpenAI released data indicating that more than a million ChatGPT users each week show “explicit indicators of potential suicidal planning or intent” during conversations with the AI chatbot. Using AI in place of a therapist or medical treatment can be harmful to an individual and to the people around them.

Why You Should Not Use AI for Mental Health Therapy

Conversation is not the same as therapy. Trained therapists can diagnose conditions, reduce harm, and create a health plan tailored to your mental health needs. Artificial intelligence can be dangerous.
