
Elon Musk’s AI Called My Mother Abusive. I Never Said That


AI now moves at two speeds.

The first is fifth gear, the speed of its creators: people like Sam Altman, Elon Musk and Mark Zuckerberg, who are racing to build machines smarter than humans. Superintelligence. AGI. It may be a dream. It may be a tech-bro illusion. Either way, it moves fast.

Then there is a second speed for the rest of us: the millions of people quietly testing what AI can do in daily life – editing emails, summarizing documents, translating medical test results. And, more and more, using AI as a therapist.

That is what I did recently. Despite my reluctance to share personal details with chatbots, I decided to talk to Grok, the large language model from Elon Musk’s company, xAI, about one of the most emotionally complex parts of my life: my relationship with my mother.

I am in my forties. I am a father. I live in New York. My mother lives in Yaoundé, Cameroon, nearly 6,000 miles away. And yet she still wants to guide my every move. She wants to be consulted before I make important decisions. She expects to have influence. When she is not kept in the loop, she turns cold.

I have spent years trying to explain to her that I am an adult, capable of making my own choices. But our conversations often end with her sulking. She does the same with my brother.

So I opened Grok and typed something like: my relationship with my mother is frustrating and suffocating. She wants to have a say in everything. When she is not informed about something, she shuts down emotionally.

Grok immediately replied with empathy. Then it diagnosed the situation. Then it gave advice.

What struck me first was that Grok recognized the cultural context. It picked up that I live in the United States and that my mother lives in Cameroon, where I grew up. And it framed our dynamic like this:

“In some African contexts, such as Cameroon, family obligations and parental authority are strong, rooted in collectivism and traditions where elders guide even adult children.”

It then contrasted this with my American life: “In the United States, individual autonomy is prioritized, which clashes with her approach and makes her behavior feel controlling or abusive to you.”

There it was: “abusive.” A word I had never used. Grok put it in my mouth. It validated me, but maybe it validated too much.

Unlike a human therapist, Grok never encouraged me to examine myself. It did not ask questions. It did not challenge me. It framed me as the victim. The only victim. And that is where it diverged, sharply, from human care.

Among Grok’s suggestions were familiar therapeutic techniques:

Set boundaries.
Recognize your emotions.
Write a letter to your mother (but do not send it: “Burn it or shred it safely”).

In the letter, I was encouraged to write: “I release your control and hurt.” As if those words could undo years of emotional entanglement.

The problem was not the suggestions. It was the tone. It was as if Grok were trying to keep me happy. Its goal, it seemed, was emotional relief, not introspection. The more I engaged with it, the more I realized: Grok is not there to challenge me. It is there to validate me.

I have seen a human therapist. Unlike Grok, they did not automatically frame me as a victim. They questioned my patterns. They challenged me to explore why I kept ending up in the same emotional place. They complicated the story.

With Grok, the story was simple:

You are hurt.
You deserve protection.
Here’s how to feel better.

It never asked what might be missing. It never asked how I might be part of the problem.

My experience aligns with a recent Stanford University study warning that AI mental health tools can “offer a false sense of comfort” while missing deeper needs. The researchers found that many AI systems “over-diagnose or under-diagnose,” especially when responding to users from diverse cultural backgrounds.

They also note that while AI can offer empathy, it lacks the accountability, training and moral nuance of real professionals, and can reinforce biases that keep people trapped in an emotional identity: often, that of the victim.

So, would I use Grok again?

Honestly? Yes.

If I am having a bad day and I want someone (or something) to make me feel less alone, Grok helps. It gives structure to frustration. It puts words to feelings. It helps carry the emotional load.

It is a digital coping mechanism, a kind of chatbot crutch.

But if I am looking for transformation, not just comfort? If I want truth over relief, accountability over validation? Then no, Grok is not enough. A good therapist might challenge me to break the loop. Grok simply helps me survive inside it.



