
Should AI Be Used in Therapy?


[Image: Robot hand reaching out to a human hand]

Artificial intelligence (AI) is changing many aspects of our lives, and therapy is no exception. The intent of using AI in therapy is to improve the therapeutic process for clients and counselors, not to replace mental health support from a trained professional.


Research indicates that AI can make therapeutic resources more convenient, personalized, and accessible. These benefits led Lottena Wolters, a Licensed Professional Counselor and CEO of The FL Wolters Group, PLLC, to begin incorporating AI-driven tools into our practice. Before we could start using AI, however, it was crucial to make sure we understood the technology and followed the relevant ethical and legal guidelines.


How Does AI Complement Therapy?

Techniques such as machine learning and natural language processing (the ability of machines to understand and use human language) enable AI systems to process large amounts of data, which can support diagnosis, treatment, and administration. Various AI apps, chatbots, and other virtual assistants can be effective when used alongside traditional therapy, often providing mental health professionals with additional analysis, resources, and benefits, including the following:


  • assisting human therapists with administrative tasks (such as scheduling and documentation), freeing up more time for client interaction and reducing the risk of burnout;

  • providing clients with 24/7 support, helping to monitor and improve coping practices, mood changes, and overall progress between therapy sessions; and

  • allowing individuals who might otherwise face financial or other barriers to receive accessible and affordable mental health support.


Lottena Wolters (LPC) recently began using the AI platform Blueprint with some of her clients to help with clinical documentation. When asked how using AI has changed her therapeutic process, Lottena shared that the platform “cuts note time by 10-25 minutes per note, allowing more time for clinical follow up, session planning, and homework planning.” She indicated that AI can assist both clients and clinicians with tracking homework and reviewing sessions.


Can AI Replace Human Therapists?

Many people have expressed concerns that AI tools might eventually take over therapeutic roles, but the American Counseling Association (ACA) stresses that AI should never replace the judgment of human counselors in guiding therapeutic decisions. Even though AI apps and chatbots can provide mental health support between therapy sessions, these tools have significant limitations that prevent them from replacing human therapists. Some of these limitations include:


  • The therapeutic alliance between counselor and client is central to treatment success, and AI systems cannot replicate this type of genuine, empathetic relationship.

  • Data misinterpretations by AI tools can lead to inaccurate assessments, diagnoses, and treatment recommendations, potentially causing more harm than good in therapy.

  • AI platforms rely on large data sets that may include biased information, which can lead to inappropriate recommendations or responses for clients.


Research suggests that the therapeutic relationship between counselor and client is the single most important factor affecting treatment retention and success in psychotherapy. Modern AI tools cannot build authentic relationships with clients, nor can they fully understand the complex, nuanced emotions that often arise in therapy. Lottena Wolters echoed the importance of the therapeutic alliance, stating, “I don’t feel like that can be replaced.”


What Are the Ethics of Using AI?

To keep the benefits of AI tools while reducing their risks, mental health professionals need to make sure the platforms they use meet clinical needs and ethical standards. For example, general-purpose AI tools such as ChatGPT should not be used in therapy because they do not comply with privacy regulations like HIPAA, so client data is not guaranteed to be encrypted and protected. AI platforms like TheraPro, Mentalyc, and Blueprint are specifically designed to help maintain patient confidentiality, but even these tools need to be consistently monitored by human therapists to make sure other ethical and legal guidelines are met, some of which are listed below:


  • Therapists must obtain comprehensive informed consent from clients before using AI tools as part of their therapy.

  • Routine audits and security updates are necessary to reduce the risk of privacy violations.

  • Therapists must regularly evaluate AI tools for possible biases and mistakes that could interfere with appropriate diagnosis and treatment.

  • As AI-driven tools continue to develop, therapists must stay informed of relevant changes through continuing education.


Lottena Wolters (LPC) currently uses Blueprint and has recommended the platform to other clinicians at The FL Wolters Group, PLLC. In addition to reducing the time required for clinical documentation, AI has benefited other areas of our practice: supervision meetings are sometimes recorded and transcribed by AI, and clinicians have used AI tools to help with session planning, intervention recommendations, and treatment preparation and follow-up. Before deciding to incorporate AI into the practice, Lottena discussed the relevant ethical and legal guidelines with team members and made sure clinicians were sharing the same details with their clients.


What Are Your Thoughts About AI in Therapy?

With the right tools and oversight, AI can complement traditional therapy. While it can make therapeutic resources more convenient for clients and help therapists reduce burnout, it must be used carefully to avoid ethical violations and any loss of the human connection that is so critical to treatment.


How would you feel as a therapy client about AI supporting your mental health journey? Do you believe it would enhance your experience, or do you have reservations? Join the conversation about the future of therapy and help shape the way our practice uses AI! Share your thoughts in the comments below, or contact us directly with your questions.


 

Sources:

American Counseling Association. (2014). ACA code of ethics. Alexandria, VA.

Harper, R. (2024, April 8). AI will make mental healthcare more human. Psychology Today.

Jeyaraman, M., Balaji, S., Jeyaraman, N., & Yadav, S. (2023). Unraveling the ethical enigma: Artificial intelligence in healthcare. Cureus, 15(8).

National Board for Certified Counselors. (2024). Ethical principles for artificial intelligence in counseling.

Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing mental health with artificial intelligence: Current trends and future prospects. Journal of Medicine, Surgery, and Public Health, 3(1).

Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6(1).


