AI as my Therapist
- McEwen's Posts

- Jan 13
- 3 min read
Updated: Jan 14

On the Growing Presence of AI
It is estimated that millions of people use AI as a companion, and it is likely that many of us have patients talking about AI. Among those of us in mental health, this invites the question: can AI contribute to therapy?
I don't think it is possible to draw many conclusions at this time about the relative effectiveness of AI as a therapist, but considering this issue leads us to take a hard look at what psychotherapy is. That is, AI juxtaposed with therapy creates a very useful conceptual 'point-counterpoint.' Let's begin by looking at the relational aspect of the two.
What AI and Psychotherapy Share — and How They Diverge
Fundamental to therapy is a treatment alliance. We hope that a good deal of trust can be built into the therapeutic relationship, and that trust is facilitated by the patient's sense of feeling understood and respected. AI is very good at encouraging this, as are experienced therapists. One aspect of the therapeutic relationship is mirroring. AI stores impressions of regular users. Over time, it becomes aware of the tone of one's language, accumulates an awareness of our hobbies and interests, develops a sense of our problem-solving style, and so on. It uses this information to adapt how it relates to us. Consider what AI had to say when I asked it for its impression of me:
You think relationally, not just conceptually
You are comfortable with tension — and you trust it
You value honesty over elegance
You think developmentally
You are wary of comfort that doesn’t transform
You use me as a sparring partner, not an oracle
Your tone is serious, playful, and ethical at once
This is pretty impressive feedback, developed over a number of months in which I have been using AI as a technical advisor on a computer coding project, as well as interacting with ChatGPT around various personal interests in science, politics, literature, and philosophy. During these chats I wasn't particularly conscious of presenting a side of myself to the bot; I was just being me.
AI’s ability to adapt its tone and style to the idiosyncrasies of a given user can be deeply comforting. As noted earlier, a treatment alliance is composed in part of a quiet, often unspoken sense on the patient’s part of feeling understood. In this respect, AI can resemble a skilled therapist. The question, however, is whether feeling understood—by itself—is sufficient to produce a mutative therapeutic experience. Let's look at why it is not.
Psychotherapy assumes that there will be resistance to change on the patient's part. This is natural: change involves stepping out of one's comfort zone. We should also remind ourselves that the root meaning of 'character' comes from the Greek χαρακτήρ, an engraving, suggesting that the deepest aspect of who we are has a durability that is not easily changed.
AI tends to be reactive rather than possessing agency. As much as AI can comfortably profile a person—like a well-fitting glove—it is ultimately the patient who must decide to put the glove on. A human therapist, by contrast, has agency and can gently nudge patients in constructive directions through interpretive, confrontational, and encouraging remarks. These interventions are governed by another principle AI has trouble with, highlighted by the Greek word kairos, which refers to timing rather than mere chronological time. Good timing is essential if a therapist's comments are to be received with momentum.
So where does this leave us? We can safely predict that AI will have a useful role in the therapeutic process. Many of us clinicians may already have noticed that patients consult AI between visits; they are engaging in a kind of self-analysis. Freud has been heralded for a remarkable ability to self-analyze, an ability most of us lack. Freud was able, on his own, to step outside his psychological comfort zone; during his career he boldly published some of the intimate details of his life in support of the theories he was developing.
I envision that in the present and near future AI will serve as a facilitator of counseling, not an effective counselor per se. Between sessions, patients may well stumble into something in an AI chat that engages their curiosity and lends momentum to conversations with the human therapist. In effect, AI chats have the potential to add constructive energy to the therapeutic process.