I read in the Wall Street Journal about Artificial Intelligence software programs now doing therapy for mental illness. With a bit of exploration, I found that A.I. is also treating addiction. I am skeptical that A.I. will be a successful approach to treating a disease that we know to be "cunning, baffling, and powerful."
A.I. therapy is growing fast because it can be immensely profitable. If I wrote a ChatGPT-style program that sounds like me, the RecoveryRobot could treat hundreds of people at a time, charging them, say, $40 each; the sky is the limit on how much I could earn. Also, I wouldn't have to spend any time with people, actually listening to them. I would just have to teach the RecoveryRobot basic skills of empathy and encouragement, plus some facts about addiction and recovery.
Back in 1981, I went into the Mascoma Bank in Enfield, New Hampshire. It was a small bank with just a few branches, and I knew the tellers there. One day, they were all laughing a lot, and I discovered that they had loaded a software program called "Psychiatrist." No matter what they typed into their computers, the "Psychiatrist" software would respond with "minimal encouragement," such as "Tell me more about that," "How long have you felt that way?" or "What would it take for you to make your goal come true?" No matter how hateful or bizarre their input, the "psychiatrist" would respond with gentle encouragement to explore their thoughts further.
This all might be amusing, except the article went on to describe a program called "Sonny" that is now being bought by schools as a substitute for school counselors. Sonny engages troubled students online to share their thoughts, fears, and problems with the A.I. The creators claim that 80% of the students report getting better, but what about the other 20%, especially in schools that have no human counselor to follow up with them?
The biggest problem with A.I. counseling is that most of it is written to gently affirm the user and encourage them to explore their thoughts and feelings. That's okay, but not if the user is suicidal. A live therapist would not gently affirm feelings of suicide and help explore the best ways of killing oneself. A robot programmed to be affirming in all cases would.
As an alcoholic and addict, I need others of my kind who can rapidly detect sophisticated bullshit and challenge me on it. The Big Book says that there are those with grave emotional and mental illnesses, but they too can recover if they have the capacity to be honest.
In 2010, long-dormant trauma came back to life in me. I couldn't sleep and became suicidal. My home group in AA busted me on my thinking and decided for me that I had to see a psychiatrist because I wanted to kill myself. I said no. They insisted yes. They renamed the group, formerly "The Gratitude Group," the "Shut Up and Get in the Car Group." When I wouldn't go to AA, they said "Shut up and get in the car" as they dragged me to meetings. They also dragged me to a psychiatrist, who put me on important anti-psychotic medication. I stopped wanting to kill myself, and I'm still on those meds, 15 years later.
If the AA group had given me warm, empathic agreement with my feelings, I would have killed myself. Their no-bullshit confrontation saved my life. It was the right thing at the right time. I could have fooled ChatGPT, but I could not fool a room full of sober drunks. God worked through them to get me what I needed.
Our book says: "Remember that we deal with alcohol: cunning, baffling, powerful! Without help it is too much for us. But there is One who has all power; that One is God. May you find Him now!"
Today I have another home group, in Alamo Heights, Texas, and the same power of God is there as we protect each other.