If you’re considering using AI as part of your mental health journey, it can be an invaluable learning tool, but we would all do well to remember the risks. I’m heartened that the survey results show this:
Despite the utility of these tools for managing mental health, the survey highlighted deep reservations about their safety.
The risks are many. Whether your very personal details are being kept private, or could end up in a data breach, is a serious concern. Hallucinations are real; I see them in my own professional field. And AI tools can be biased, basing their responses on inaccuracies baked into the models. We’ve seen too many stories of chatbots leading people to disastrous outcomes. It’s risky.
I work with AI professionally. I use it to get things done and to support research, but I never trust it blindly or depend on it. It’s a tool. For mental health, it can be a tool too, and I’m sure many of you are finding it helpful. Still, I would caution all of us to be careful. Mental health professionals have serious reservations; keep them in mind.