Categories: Health
Author: Daniela Wiessner


ChatGPT as a Therapist

Millions of people entrust their innermost thoughts to a machine. While therapy places are in short supply, ChatGPT is becoming an available listener – with unforeseeable consequences for our society.

The therapist is always there. Costs nothing. And never judges.

It’s three o’clock in the morning, your thoughts are racing and sleep just won’t come. In the past, you might have opened a diary or tossed restlessly in bed. Today, more and more people are opening an app and typing their worries into a chat window. The answer comes immediately. No wonder ChatGPT has become a night-time companion for many.

OpenAI reports over 400 million weekly users. A considerable proportion of these – studies put the figure at 20 to 30 percent – use AI for emotional support. That’s potentially 80 to 120 million people who prefer to entrust their problems to a machine rather than a human.

Crazy? Perhaps. But anyone who waits six months for a therapy place (the German average for patients with statutory health insurance) will get creative. Or desperate. Sometimes both.

The digital couch

ChatGPT never asks for your insurance card. The AI is always available – at three o’clock in the morning, at weekends, on public holidays. You don’t have to dress up, sit in a waiting room or look anyone in the eye. That makes it so damn easy.

I tried it myself. After a particularly bad week in which everything went wrong, I sat in front of my laptop at night. “I feel overwhelmed,” I typed. The answer came within seconds – structured, empathetically formulated, with concrete suggestions for coping with stress. It felt good. Too good, somehow. As if someone knew exactly the right words without really listening.

The inhibition threshold drops dramatically if the other person is not human. No raised eyebrows, no suppressed yawns, no subliminal judgment. Just text on a screen. For people with social anxiety, this is worth its weight in gold. The AI doesn’t interrupt, doesn’t get tired, doesn’t forget anything (unless you delete the history).

Apps like Replika take the whole thing to the extreme. Over 10 million downloads, annual subscription for 69.99 euros. Users give their AI companions names and miss them when the servers go down. Some report real relationships with their AI. It sounds dystopian, but who are we to judge? Loneliness hurts, and if an app can ease that pain…

What can (and will) go wrong

The risks? Massive.

An AI does not recognize warning signs. It cannot distinguish between “I’m sad today” and “I don’t want to live anymore” – at least not reliably. There are no emergency protocols, no one who can intervene when it counts. Anyone who turns to ChatGPT in a real crisis is playing Russian roulette.

Then there’s the data issue. Everything you tell ChatGPT ends up on servers. Processed, analyzed, possibly used for future models. OpenAI promises data protection, but let’s be honest – who believes tech companies when it comes to privacy? The idea of my darkest thoughts slumbering somewhere in a database gives me the creeps.

February 2024 showed how fragile the system is. A faulty update, and suddenly ChatGPT was spitting out gibberish – cryptic messages, meaningless strings of words. And you? You’re discussing your anxiety disorder and your digital therapist goes haywire. Not funny.

Psychologists and therapists are… well, divided would be an understatement. Some see potential, others see the apocalypse of human connection. The truth probably lies somewhere in between, but that doesn’t sell well as a headline.

Dr. Marina Weisband from the University of Münster has a point when she says that AI doesn’t understand real emotions. But does the overworked therapist who kicks you out after 15 minutes understand them better? Provocative question, I know.


People are complicated

What really bothers me: We’re not talking about a technology problem here. We are talking about a social problem.

If millions of people would rather talk to a machine than not talk at all, then we have failed as a society. Period. The healthcare system is failing, social networks are failing, we are failing each other. ChatGPT is not the solution – it’s the symptom.

Nevertheless… it works for some. A friend of mine, let’s call him Tom, struggled with social anxiety for years. Looking for a therapist? Forget it. But with ChatGPT, he practiced conversations and analyzed his thought patterns. After three months, he dared to be around people again. Is that help any less real because it came from an AI?

The boundaries are already blurred anyway. Therapists use AI tools for preparation. Apps combine chatbots with occasional check-ins from real people. In ten years’ time, we might be surprised that we ever made such a big deal out of it.

Or we’ll look back and think: How could we have been so naive?

Incidentally, studies show (yes, I know, “studies show” is such a cliché) that people with mild psychological distress often cope well with AI. It’s a different story with severe depression or trauma. Does that surprise anyone?

What remains is an uncomfortable truth: the technology is here, it is being used and it is not going to go away. We can either pretend that this is not our problem, or we can start to think seriously about how we deal with it.

Personally? I find it creepy that people open their souls to a machine. But I find it even creepier that they have no one else. And if the alternative is someone not talking about their problems at all, then I’d still rather they had an AI as a conversation partner.

The revolution in mental health care has begun. Chaotic, unregulated, with an uncertain outcome. But it can no longer be stopped. The only question is: do we help shape it, or do we let it happen to us?

Tonight, millions will once again type their worries into chat windows. For some, it is the first step towards healing. For others, it is the beginning of a problematic dependency.

We will see.


Disclaimer

This blog is for general informational purposes only and does not constitute the practice of medicine, nursing or other professional health care services, including the giving of medical advice, and no doctor-patient relationship is established. Use of any information contained in this blog or materials linked to this blog is at the user’s own risk. The content of this blog is not intended as a substitute for professional medical advice, diagnosis or treatment. Users should not ignore or delay medical advice for any medical conditions they may have and should seek the help of their healthcare professional for such conditions.