News

ChatGPT can be dangerous to use in place of a therapist, but people are finding ways it can supplement therapy and help them ...
A new Stanford study reveals risks in AI therapy chatbots, showing they may stigmatise users and give unsafe responses in mental health support.
Despite being seen as a convenient alternative for emotional support, ChatGPT may be risking lives. A new study reveals how ...
Though AI may seem responsive and comforting, it lacks the ability to recognise when a user is spiralling or in crisis. It ...
Something troubling is happening to our brains as artificial intelligence platforms become more popular. Studies are showing ...
Travis Tanner says he first began using ChatGPT less than a year ago for support in his job as an auto mechanic and to communicate with Spanish-speaking coworkers. But these days, he and the ...
Huge numbers of people online have warned that ChatGPT users are suffering mental health crises. In fact, people have even posted delusions about AI directly to forums hosted by OpenAI on its own ...
In March alone, 16.7 million posts from TikTok users discussed using ChatGPT as a therapist, but mental health professionals are raising red flags over the growing trend that sees artificial ...
It might have details on mental health conditions, but the experts agree that ChatGPT should not be regarded as a replacement for a therapist. It's an AI model, so it lacks the empathy and nuance that ...
Koko, a digital mental health nonprofit, said it used GPT-3 chatbots to help develop responses to 4,000 users on its peer support platform. ... ChatGPT and the mental health gray area.
From brain rot to induced psychosis, the psychological cost of generative AI is growing and flying under the radar.