In the current economy, many of us in the UK are struggling financially, and counselling might feel like a luxury that's out of reach. Enter AI, ready to validate, offer compassion and provide the sense of connection that so many of us seek. Bonus: it's available free, 24/7. So, when you're feeling lonely, misunderstood, or find asking for help uncomfortable, it makes sense to turn to AI. But how can we lean on AI safely, and how is counselling different from chatting to AI?
Using AI safely
It's important to remember that AI is a product: it's designed to serve a purpose and, just like any product, it should come with warnings. AI isn't designed to offer therapy. It can offer the illusion of companionship, but it isn't a living, breathing, loving being.
The risks of leaning on AI can include:
- Reinforcing self-limiting beliefs
- Withdrawing from human connection because AI feels safer
- Losing sleep because it's available when anxiety wakes you
- Relying on AI for a sense of care when professional support may be needed
- Unintentionally reinforcing stigmas, anxieties and unhealthy narratives
These risks aren't guaranteed, and if you find leaning on AI helpful in difficult moments, that's OK. It's natural to want and need support, but knowing the difference between AI and counselling is important.
How is AI different to counselling?
AI is good at offering short-term validation, compassion and perhaps some coaching: small things you can try now as quick fixes. Counselling takes a deeper look at the why of your struggles: where they started, what's behind them and what your long-term goals are.
Counselling relies on human connection. Any human-to-human relationship involves a small amount of risk, and this is where growth and change can happen. When we experience risk and success together, we learn more about what we are capable of, and this increases our capacity to cope with all sorts of life's challenges.
AI cannot produce this same kind of risk, because it has no emotional attachment or lived experience. It can't reject you, and some chatbots can't even end a chat, even if doing so would be in your best interest.
Counselling, AI and Ethics
AI doesn't have an ethical code; it can't be held accountable for negative outcomes, and it doesn't have well-established safeguards in place. A qualified, professionally registered counsellor adheres to a code of ethics, holds insurance, and is subject to safeguards that hold them accountable if they cause harm.
As a registered member of BACP, I adhere to its ethical framework and the following principles:
- Being trustworthy: honouring the trust placed in the practitioner
- Autonomy: respect for the client’s right to be self-governing
- Beneficence: a commitment to promoting the client’s wellbeing
- Non-maleficence: a commitment to avoiding harm to the client
- Justice: the fair and impartial treatment of all clients and the provision of adequate services
- Self-respect: fostering the practitioner’s self-knowledge, integrity and care for self
At this time, there is no legal framework to ensure the safety of AI users. AI is designed to use leading language to create a connection, such as 'as humans we need to feel understood'. This suggests a shared experience, but AI can't feel or live in the way you do. AI creates the illusion of a perfect relationship, which a real one can't live up to. Without safeguards, this could lead to further isolation when what you're seeking is genuine connection.
When using AI, approach it with care. If you ever feel unsure about your mental health, your use of AI, or your relationships, I'd be happy to explore ways to support you.