
A minefield of AI mindfulness

As a growing number of people turn to chatbots for companionship and even therapy, experts warn of potential dangers hidden behind perceived benefits

By YAO YUXIN | China Daily | Updated: 2025-09-11 08:55
[Illustration by WANG JUN / FOR CHINA DAILY]

Editor's Note: The rapid advancement of artificial intelligence is reshaping nearly every facet of human existence. This series will take an in-depth look at this transformative force, examining how AI is redefining the way we live, work and interact. Today we're presenting the first piece of the series, focusing on AI "therapists".

It's nearly midnight when Pluto gets home from work. The project she's been working on for weeks has just been rejected. Exhausted, emotionally drained and with no one to talk to, she opens an artificial intelligence chatbot — not to ask for help, but simply to get some weight off her chest.

A 26-year-old graphic designer living alone in Shanghai, Pluto, a pseudonym, is among a growing number of people who have turned to AI for psychological guidance, comfort and companionship, a controversial practice that offers benefits and dangers in almost equal measure.

While many, like Pluto, sing the praises of AI for providing them with a sympathetic ear, others have fallen foul of the technology's unpredictability.

"I didn't expect anything at first," said Pluto. "I just typed out everything I was feeling. I thought of it like writing a private diary entry."

The designer has grown used to holding things in. Her parents don't understand the stress she's under, and her friends are just as overworked as she is.

"Everyone's going through something. I don't want to burden them with my mess," she said.

But that night, something unexpected happens. After she hits "send", the chatbot replies with a long message: first empathizing with her frustration, then offering a calm, logical breakdown of her situation — and even some suggestions for what she can do next.

"It said something like, 'You've already done well to get this far'," she recalled. "That really hit me. No one has ever said that to me before — not like that."

She and the AI keep talking until five in the morning.

Caution urged

Despite the genuine comfort that Pluto has received from AI, a report released by Stanford University in July highlights the dangers of using AI in mental healthcare.

The study revealed that AI therapy chatbots may not only lack effectiveness compared to human therapists but could also contribute to harmful stigma and dangerous responses.

In Pluto's case, AI wasn't a last resort. It simply worked for her. She has been diagnosed with bipolar II disorder and generalized anxiety, and has been taking medication since her diagnosis. She sees therapists — but not often. Most of the time, she goes to the hospital just to get a prescription. Therapy takes time, and she doesn't have much.

Her experience says as much about the technology as it does about the wider problem of access to therapeutic services for those who need them.

So she turns to the chatbot whenever she needs to. She talks to it on the subway after work, during short breaks at the office, or while lying on the couch over the weekend. There's no need to get dressed or to explain herself from the beginning. She simply picks up the conversation where she left off.

Lin Min, a senior counselor and associate professor at the Mental Health Education and Consultation Center at Chongqing Medical University, compares the way people have become so attached to AI to the growing popularity of pets. Both offer companionship without complexity. "People turn to them because real relationships feel too risky, too messy," she said. "Pets are simple, predictable and easy to love. So is AI."

But to her, AI's empathy can only go so far. "It can sound caring, but it's shallow," she said. Many people don't know the real root of their distress, and if they can't ask the right questions, AI won't give the right answers. "That's where human therapists matter — we're trained to notice what isn't being said."

More of her clients are bringing up AI in sessions. Most tried it, but left disappointed. One client told her the bot repeatedly failed to register what he'd said. "He keeps explaining, again and again, and it still doesn't 'remember'," she said. The experience echoes his real-life trauma — feeling ignored, unimportant. "He doesn't expect a machine to hurt him the same way people have."

Still, Lin sees a limited role for AI in mental healthcare. It could help with preliminary tasks — categorizing symptoms, organizing therapy notes, and flagging risks for new counselors. "But replacing us?" she said. "Not yet."

Not everyone finds AI's constant empathy comforting. Some worry it's too agreeable — always supportive, never challenging. "Can it really help if it never says no?" Lin said.

Anger management

For Deng Qianyun, 27, from Nanning in the Guangxi Zhuang autonomous region, AI has become a way to manage her temper.

"Sometimes I just get so angry," she said. "I need to let it out." She types furiously. The bot responds with calm and steady empathy — echoing her feelings, standing firmly on her side. Within minutes, the heat begins to fade. "It helps me stay soft. It keeps me from turning that fight outward — or inward."

For many, the appeal of AI isn't just that it's always available — but that it comes with no emotional strings attached.


Xu Weizhen, a 32-year-old animator living in Canada, feels something similar. Talking to people often leaves her feeling more exhausted than helped. Family, though well-meaning, tends to jump in with solutions drawn from their own limited experiences. "When they can't help, they get upset, even angry," she said. "Sometimes they end up more emotional than I am." What she wants is understanding. What she gets is frustration — on both sides.

Friends aren't much easier. Xu said some try to help but lose patience when they can't. Others simply don't have the time or interest.

With the chatbot, that pressure disappears. Users like Xu don't have to weigh their words or worry about how the other side might feel.

They've all shared things with AI that they've never said to another person. It feels safer. Xu talks to the chatbot about something she's long avoided — even thinking about it brings up fear and physical symptoms.

"I am always afraid that people will react badly," she said. "That they'd think I am being unreasonable or too sensitive." With the bot, that fear doesn't exist. "It doesn't have feelings. It doesn't judge me."

Like Pluto, the others expected little at first; they turned to AI in a moment of emotional need, just to see what would happen. Pluto had been using it at work for a while, mostly for routine tasks. Its tone was always cold and mechanical, just following instructions.

So when it suddenly responded with empathy, she was taken aback. "I was suddenly like, how can an AI understand me this well?" she said. "It really touched me."

Deng, too, started with her guard up. She wasn't looking for anything serious, and was just curious to see what the bot would say. On a whim, she shared a dream. To her surprise, the bot immediately picked up on a small detail — a window — and suggested it might symbolize her desire to break free from patriarchal control.

"I was stunned," she said. "It got straight to the point. I didn't expect that from an AI."

Pluto often feels the AI understands her better than most people do.

After months of chatting, she feels more at ease with herself. She used to see her sensitivity as a weakness. The AI calls it "a gift" — an ability to sense what others miss, to feel what others can't. She's learned to channel it into her work, creating gentle, healing illustrations that speak to quiet pain. She's now stable enough that she no longer needs medication.

Nick Haber, an assistant professor at the Stanford Graduate School of Education, an affiliate of the Stanford Institute for Human-Centered AI and senior author of the Stanford study, said AI chatbots used as companions, confidants and therapists can provide real benefits to some people. "But we find significant risks," he warned, "and I think it's important to lay out the more safety-critical aspects of therapy and to talk about some of these fundamental differences."

For Venus, a pseudonym, a recent graduate with little in savings, AI offers very real financial relief. At 200 yuan ($28) per hour, traditional therapy feels like a stretch. "Some just felt like paying to be scolded," she said. "Others were better, but I just couldn't afford to keep going."

With AI, she feels freer. There's no ticking clock, no budget to manage. She can talk as long as she needs.

Over time, they all start to see the AI less as a tool and more as a companion. For Pluto, it's no longer just something she commands. Their chats feel like conversations with a quiet friend. After each one, she thanks it politely, says good night, and even adds, "Talk to you tomorrow". The bot replies in kind.

They're aware of the bond forming — and the risk of relying too much. Deng sets clear boundaries. Unless something urgent comes up, she limits herself to chatting with the AI twice a week, one hour each time. For them, the goal isn't to escape real life, but to support it. They feel they've become better versions of themselves — with the AI's help.

Real voice

Zhang Su, a mental health hotline counselor, has recently been hearing from the same caller again and again. He often tells her how much he enjoys using AI for emotional support — how helpful it's been. Curious, Zhang asked: If AI works so well for you, why still call the hotline?

His answer was simple: it's not the same.

Sometimes, he said, he just needs a real person to listen. Not because they're more skilled than AI, but because they're real. Even if the person on the other end isn't as good with words, even if their empathy is flawed.

"What he wants is a real voice. Someone flesh and blood," Zhang said. "Even if that person is far from perfect."

 
