I remember feeling blue one morning, with no desire to leave my bed. The day dragged on, and I kept feeling a longing to die. No, I didn’t actually want to die, but it felt that way. I opened ChatGPT and typed, “Hi, I feel like dying.”
The fear that AI would cause job losses existed long before AI chatbots became mainstream. That concern was valid then, and it seems even more relevant now: people are asking AI questions and assigning it the roles of personal assistant, event planner, advisor, and even therapist.
Over the years, a loneliness epidemic has spread as we stay glued to our screens. We could blame capitalism for this, but the link between the two isn’t the main point here. Have you ever asked an AI model to help you through an existential crisis?
A 2024 study indexed by the National Library of Medicine found that about 28% of the people surveyed had used AI for “quick support and as a personal therapist.”
According to a Market.us report, analysts project that the global AI in mental health market will reach $14.89 billion by 2033, a significant increase from $0.92 billion in 2023.
All indicators point to explosive growth in AI for mental health, alongside the concerning fact that mental health issues are becoming ever more persistent.
AI is being used as a tool for conversation and therapy, offering innovative solutions to the challenges in mental healthcare. It appeals to people because it carries no social stigma, is always available, and costs far less than traditional therapy.
However, a major concern is that AI can be overly agreeable, a phenomenon called sycophancy: the model agrees with a user’s beliefs instead of providing an objective or helpful perspective. This can lead it to confirm false information, create echo chambers, and become less reliable.
Professional bodies such as the American Psychological Association have raised concerns about using AI for therapy, and reliance on AI counsel has reportedly contributed to adverse outcomes, including violence and suicide.
Despite these risks, I know people will continue to use AI models for therapy, which is why a new AI model called Arcarae caught my attention. It’s designed to connect a person’s inner world with the outside world through self-reflection and science. What first drew me to it was rage bait on X (formerly Twitter): someone asked whether VCs knew how to spend money, and another user described it as an LLM for people without an internal dialogue.
Arcarae is an AI model designed to shape the future of cognitive intelligence. It addresses three critical AI failures: sycophancy, attention deficits, and inconsistent prioritization. My interactions with the model suggest it’s not just AI therapy. As the founder put it, it works as a tool to bridge the gap between a person’s internal and external worlds, helping them remember who they are and create unique things.
I haven’t used the model extensively enough to uncover its drawbacks, but I’m sure there are some. Overreliance on an AI model to guide your thinking would, in the long run, increase the chances of being buried in augmented or virtual reality.
Abraham Adenuga Jones