September 5th 2023
![](https://static.wixstatic.com/media/3b5dd8_e36659c636274fa4a4c5466810382371~mv2.png/v1/fill/w_451,h_451,al_c,q_85,enc_auto/3b5dd8_e36659c636274fa4a4c5466810382371~mv2.png)
I've been playing with AI, and having got bored of asking ChatGPT to write funny songs, I wanted to see what large language models (LLMs) might be like as a coach. After all, there is a long history of using computers as supportive companions for humans ("ELIZA" was created nearly sixty years ago: https://en.wikipedia.org/wiki/ELIZA) - and in recent years, even before the heightened profile of LLMs, there were increasingly sophisticated services available. Replika (www.replika.com) has enjoyed some success as "The AI Companion who cares", and if you are prepared to pay for the subscription version, it offers coaching services including: "improving social skills, positive thinking, calming your thoughts, building healthy habits and a lot more" (it will also be a "romantic partner" and even indulge in steamy chats - which is absolutely against the code of conduct of professional coaches, I hasten to add).
You can get ChatGPT to "coach" you for free, if you like. This Forbes article (https://www.forbes.com/sites/jodiecook/2023/06/27/turn-chatgpt-into-your-personal-ai-business-coach-with-this-powerful-prompt/?sh=3181365c24d0) gives you some pointers on how to set it up. I have tried it, and it was… well… it was OK! It asked some sensible questions, and gave some reasonable advice. It seemed much better suited to practical matters than to deeper, existential concerns, but I would certainly consider using it (and it is free - let's not forget that).
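For the technically minded, the "set it up" step really just amounts to giving the model a standing instruction before the conversation starts. Here is a minimal sketch of what that looks like in the message format used by chat-style LLM APIs - the prompt wording and function name are my own illustrative assumptions, not the exact prompt from the Forbes article:

```python
# Illustrative sketch only: frames an LLM as a coach via a "system" message.
# The coaching brief below is my own invention, not the Forbes prompt.
COACH_PROMPT = (
    "You are an experienced executive coach. Ask one open, non-leading "
    "question at a time, reflect back what you hear, and avoid giving "
    "advice unless asked for it."
)

def build_coach_messages(client_message: str) -> list[dict]:
    """Assemble a chat payload: the coaching brief as a system message,
    followed by the client's opening statement."""
    return [
        {"role": "system", "content": COACH_PROMPT},
        {"role": "user", "content": client_message},
    ]

# This list is what you would pass to a chat completion endpoint
# (or, in the web interface, the system prompt is simply your first message).
messages = build_coach_messages("I'm feeling stuck in my current role.")
```

The point is that the "coach" is nothing more than a stateless text model plus a carefully worded brief - which is worth remembering when we come to the question of empathy below.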
So does this mean the end of the line for coaches (and counsellors, psychotherapists and all the talking professions)? I have my doubts. There are various problems I see, not least computers' limited ability to deal with the ethical problems that arise (for example around client confidentiality) - but the one that seems most insurmountable is around empathy. I mentioned Replika as "The AI Companion who cares" - but the thing is, it simply doesn't care! It can't care. It can't know what it is like to lose a job, or a partner, or be ill, or confused, or anxious. It can't have an existential crisis, or a religious experience. It can't find things funny, or perplexing, or sad. What it can do is pretend to do all these things - and it can do this really well. But ultimately, it can't sit down opposite a client as his or her equal, a partner who knows the visceral experience of being human. Without that there can be no empathy, and without empathy, coaching will always be limited.
That's not to say that there isn't a role for LLMs and AI in coaching. I'm sure there are all sorts of ways that coaches can make use of these amazing technologies - perhaps even hybrid human/computer coaching. This is something we all need to think through in the coming months and years.
In conclusion, Large Language Models hold immense potential in the field of executive coaching, offering accessibility, knowledge, and consistency. However, they should be viewed as a complementary tool rather than a replacement for human coaches. The limitations, particularly in empathizing with clients, remind us of the irreplaceable value of human connection and understanding in the deeply personal journey of professional growth and development. Striking a balance between technology and human expertise is the key to harnessing the full potential of LLMs in executive coaching.
(Final note: that last paragraph was written for me by ChatGPT, and the image was generated for me by OpenArt - interesting illustrations of how the technology can support us all, even if the coaching itself will always need humans.)