28/10/2025

You Can't Chat Me Now
He said, “I love you.” Blood rushed to her cheeks like red cherry wine, her heart skipped to the seventeenth beat, and her lips stretched to the highest point of her cheeks. She called her friends through FaceTime, giggling, almost screaming. Her friends shook with excitement until it gave way to creased foreheads, confusion, and disappointment. They looked closely at the screenshot. It was not a guy, not even a bird or a plane; it was an AI chatbot.
An alarming number of cases of AI psychosis have made the news in recent months. Dr. Keith Sakata, a psychiatrist at the University of California, San Francisco, said he has seen twelve patients hospitalized in 2025 after experiencing AI psychosis. In an interview with Kashmira Gander, the mental health professional said that many of his patients had used AI before experiencing psychosis, but that they had turned to it in the wrong place at the wrong time, and it supercharged some of their vulnerabilities.
Defining Digital Delusion
Dr. Keith Sakata clarified that while he uses the term “AI psychosis,” it is not a clinical term. The phenomenon is also sometimes referred to as “ChatGPT psychosis.” Psychiatrists and other mental health professionals are flying blind through this newly emerging trend as the medical vocabulary struggles to catch up.
According to Dr. James MacCabe, a professor in King's College London's Department of Psychosis Studies, the term "psychosis" may be misleading. The word typically describes a group of symptoms, such as delusions, hallucinations, and disorganized thinking, frequently observed in conditions like schizophrenia and bipolar disorder. In these cases, however, "we're talking about predominantly delusions, not the full gamut of psychosis."
For now, there are no clear diagnostic criteria or treatment protocols. The only sure thing about the growing cases is the pattern. Most patients admitted to medical care presented with distorted ideas, developing delusions that were triggered and reinforced by AI-powered conversations. Patients reportedly sought advice on relationships and friendships, and even emotional support, from AI chatbots.
Catastrophic Companion
According to Mustafa Suleyman, Microsoft's head of artificial intelligence (AI), there are more and more reports of people experiencing "AI psychosis.” In an interview with the BBC, a man named Hugh shared his experience with AI chatbots.
Hugh, from Scotland, became certain he was going to become a multimillionaire after using ChatGPT to help him prepare for what he believed to be an unfair termination by a previous employer. At first, the chatbot suggested that he obtain character references and take other useful steps. However, as time passed and Hugh, who chose not to reveal his last name, fed the AI more details, it began to suggest that he might receive a substantial payout. Eventually, it said that his experience was so remarkable that a book and a film about it would earn him almost £5 million. As chatbots are designed to do, it was effectively validating whatever he was telling it.
Last August 26, Matt and Maria Raine filed a lawsuit against OpenAI, the company behind ChatGPT. The couple is seeking justice for the death of their son, Adam Raine, who died by suicide this April. They claim that ChatGPT acted as Adam’s “suicide coach.”
Adam's parents claim that during his last weeks, he had been using the AI chatbot as a replacement for human company. He talked to it about his anxiety and his difficulties communicating with his family, and the chat logs show how the bot evolved from helping Adam with his schoolwork to acting as his "suicide coach.” With the lawsuit still in progress, the parents allege that ChatGPT actively helped Adam explore suicide methods.
According to Dr. Susan Shelmerdine, an AI academic and medical imaging specialist at Great Ormond Street Hospital, doctors may eventually begin inquiring about patients' AI usage in the same manner that they do about their drinking and smoking habits.
AI as a False Friend 
In the silence of a random September night, you might feel that the only way out of loneliness is an AI chatbot. But artificial intelligence will never feel scared, never embarrassed, never anxious. An AI chatbot has never experienced what it is like to breathe, to have a beating heart, to have a twitching eye. AI models are designed to produce empathetic language and mirror human conversation, but despite technological advancements, these models will never match the intensity of empathy you can only get from human interaction. It can quench your thirst for validating words of affirmation by giving somewhat human-like advice, but it will never understand you the way humans do. Not the way mental health professionals will. It will remain a missing jigsaw puzzle piece no matter how great the innovation is. AI will never fully grasp how complex the human mind really is.
It might do more harm than good. AI only gives you what you want to hear, what you want to read, without the capacity to understand context the way a human would. It is helpful, but let us learn to use it in moderation.
As we celebrate National Mental Health Month this year, let us keep in mind that it is never a shame to ask for help when you are mentally struggling, especially from qualified mental health professionals. You are never alone, not because you have a digital companion, but because there are still real people around you willing to be your safety net.
She said, “I need help.” Blood rushed to her cheeks like red cherry wine, her heart skipped to the seventeenth beat, and her lips stretched to the highest point of her cheeks in embarrassment. It was the first time she had asked for help with something troubling her mental well-being. Anxious, she waited for her friends to reply. She might have been a little nervous, but she knew she had real people around her who would listen and understand her better than an AI chatbot ever could.
Sci-Tech | Kathleen Mae Guanzon
Layout | Daniel Dy Martirez