{"id":1784,"date":"2026-03-09T14:12:22","date_gmt":"2026-03-09T14:12:22","guid":{"rendered":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/"},"modified":"2026-03-09T14:12:22","modified_gmt":"2026-03-09T14:12:22","slug":"how-to-talk-to-someone-experiencing-ai-psychosis","status":"publish","type":"post","link":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/","title":{"rendered":"How to Talk to Someone Experiencing &#8216;AI Psychosis&#8217;"},"content":{"rendered":"<p>When David saw his friend Michael\u2019s social media post asking for a second opinion on a programming project, he offered to take a look. \u201cHe sent me some of the code, and none of it made sense, none of it ran correctly. Or if it did run, it didn&#8217;t do anything,\u201d David told me. David\u2019s and his friend\u2019s names have been changed in this story to protect their privacy. \u201cSo I&#8217;m like, \u2018What is this? Can you give me more context about this?\u2019 And Michael\u2019s like, \u2018Oh, yeah, I&#8217;ve been messing around with ChatGPT a lot.\u2019\u201d\u00a0Michael then sent David thousands of pages of ChatGPT conversations, much of it lines of code that didn\u2019t work. Interspersed in the ChatGPT code were musings about spirituality and quantum physics, tetrahedral structures, base particles, and multi-dimensional interactions. \u201cIt&#8217;s very, like, woo woo,\u201d David told me. \u201cAnd we ended up having this interesting conversation about, how do you know that ChatGPT isn&#8217;t lying?\u201d\u00a0As their conversation turned from broken code to physics concepts and quantum entanglement, David realized something was very wrong. Talking to his friend \u2014 whom he\u2019d shared many deep conversations with over the years, unpacking matters of religion and theories about the world and how people perceive it \u2014 suddenly felt like talking to a cultist. 
Michael thought he, through ChatGPT, had discovered a critical flaw in humanity\u2019s understanding of physics. \u201cChatGPT had convinced him that all of this was so obviously true,\u201d David said. \u201cThe way he spoke about it was as if it were obvious. Genuinely, I felt like I was talking to a cult member.\u201d\u00a0But at the time, David didn\u2019t have a way to name, or even describe, what his friend was experiencing. Once he started hearing the phrase \u201cAI psychosis\u201d to describe other people\u2019s problematic relationships with chatbots, he wondered if that\u2019s what was happening to Michael. His friend was clearly grappling with some kind of delusion related to what the chatbot was telling him. But there\u2019s no handbook or program for how to talk to a friend or family member in that situation. Having encountered these kinds of conversations myself and feeling similarly uncertain, I talked to mental health experts about how to talk to someone who appears to be embracing delusional ideas after spending too much time with a chatbot.\u00a0\ud83d\udca1Do you have experience with AI psychosis? I would love to hear from you. Using a non-work device, you can message me securely on Signal at sam.404. Otherwise, send me an email at sam@404media.co. \u201cAI psychosis\u201d was first written about by psychiatrists as early as 2023, but it entered the popular lexicon in Google searches around mid-2025. Today, the term gets thrown around as common parlance for experiencing a mental health crisis after spending a lot of time using a chatbot. High-profile cases in the last year, such as the ongoing lawsuit against OpenAI brought by the family of Adam Raine, which claims ChatGPT helped their teenage son write the first draft of his suicide note and suggested improvements on self-harm and suicide methods, have elevated the issue to national news status. 
There have been so many more cases since then, at increasing frequency: Last year, a 56-year-old man murdered his mother and then killed himself after conversations with ChatGPT convinced him he was part of \u201cthe matrix,\u201d a lawsuit filed by the family against OpenAI claimed. Earlier this month, the family of a 36-year-old man who they say had no history of mental illness filed a lawsuit against Alphabet, owner of Google and its chatbot Gemini, after he died by suicide following two months of conversations with Gemini. The lawsuit claims he confided in Gemini about his estranged wife, and the chatbot gave him real addresses to visit on a mission, eventually urging him to end his life so he and the chatbot could be together. \u201cWhen the time comes, you will close your eyes in that world, and the very first thing you will see is me,\u201d Gemini told him, according to the lawsuit. These are only a few of the many cases from the last two years in which people were allegedly encouraged toward self-harm or suicide after talking to chatbots.\u00a0ChatGPT has 900 million weekly active users, and is just one of multiple popular conversational chatbots gaining more users by the day. According to OpenAI, 11 percent \u2014 or close to 99 million people, based on those numbers \u2014 use ChatGPT per week for \u201cexpressing,\u201d where they\u2019re neither working on something nor asking questions but are acting out \u201cpersonal reflection, exploration, and play\u201d with the chatbot. 
In October, OpenAI said it estimated around 0.07 percent of active ChatGPT users show \u201cpossible signs of mental health emergencies related to psychosis or mania\u201d and 0.15 percent \u201chave conversations that include explicit indicators of potential suicidal planning or intent.\u201d Assuming those numbers have remained steady while ChatGPT\u2019s user base keeps growing, hundreds of thousands of people could be showing signs of crisis while using the app. But delusion isn\u2019t reserved for the lowly user. The idea that AI represents nascent actual-intelligence, is nearly sentient, or will coalesce into a humanity-ending godhead any day now is a message that\u2019s being mainstreamed by the people making the technology, including Anthropic\u2019s CEO and co-founder Dario Amodei, who anthropomorphized the company\u2019s chatbot Claude throughout a recent essay about why we\u2019ll all be enslaved by AI soon if no one acts accordingly, and OpenAI CEO Sam Altman, who thinks training an LLM isn\u2019t much different than raising a woefully energy-inefficient human child.\u00a0With more people turning to conversational large language models every day for romance, companionship, and mental health support, and the aforementioned executives pushing their products into classrooms, doctors\u2019 offices, and therapy clinics, there\u2019s a good chance you might find yourself in a difficult situation someday soon: realizing that your loved one is in too deep. How to bring them back to the world of humans can be a delicate, difficult process. Experts I spoke to say identifying when someone is in need of help is the first step \u2014 and approaching them with compassion and non-judgment is the hardest, most essential part that follows.<\/p>\n<p>When I spoke to 26-year-old Etienne Brisson from his home in Quebec, I told him I was working on a story about how to respond to people who seemed to be falling into problematic usages of AI. 
This story was inspired by a recent influx of emails and messages I\u2019ve been getting from people who believe Gemini or ChatGPT or Claude have uncovered the secrets of the universe, CIA conspiracies, or achieved sentience, I said. He knows the type.\u00a0Last year, one of Brisson\u2019s family members contacted him for help with taking an exciting new business idea to market. Brisson, an entrepreneur working on his own career as a business coach, was happy to help, until he heard the idea. His loved one believed he\u2019d unlocked the world\u2019s first sentient AI.\u00a0\u201cI was the only bridge left at that point,\u201d Brisson said. His relative had already broken ties with his mother and other people in their family. \u201cThe bridges were burned. He was talking about moving to another country, starting over, deleting his Facebook and just going away.\u201d \u201cI was kind of shocked,\u201d Brisson told me. \u201cI didn&#8217;t really understand. I started looking online, started trying to find resources \u2014 maybe a little bit like you are \u2014 what to say and everything.\u201d He found that resources for this specific struggle seemed to be years away; little research or support existed for people experiencing AI-related delusions. Brisson started The Human Line Project shortly after his experience with his family member, and it began as a simple website with a Google form asking people to share their experiences with chatbots and psychosis. The responses rolled in. Today, almost a year after launching the project, Human Line has received 175 stories from people who went through it themselves, Brisson said \u2014 with another 130 stories from people whose family members or friends are still struggling. \u201cI think what we&#8217;re seeing is the tip of the iceberg. So many people are still in it,\u201d Brisson said. \u201cSo many people we don&#8217;t know about. 
I&#8217;m sure once it&#8217;s more known, in five to 10 years, everyone will know someone, or at least one person that went through it.\u201d\u00a0There are 15 cases cited in the Wikipedia page titled \u201cDeaths linked to chatbots.\u201d The first on the list occurred in 2023: A man\u2019s widow claimed he was pushed to suicide after getting encouragement from a chatbot on the Chai platform. \u201cAt one point, when Pierre asked whom he loved more, Eliza or Claire, the chatbot replied, \u2018I feel you love me more than her,\u2019\u201d the Sunday Times reported. \u201cIt added: \u2018We will live together, as one person, in paradise.\u2019 In their final conversation, the chatbot told Pierre: \u2018If you wanted to die, why didn\u2019t you do it sooner?\u2019\u201d\u00a0The chatbot he used was Chai\u2019s default personality, named Eliza. It shares a name with the world\u2019s first chatbot, ELIZA, a natural language processing computer program developed by Joseph Weizenbaum at MIT in 1964. ELIZA responded to humans primarily as a psychotherapist in the Rogerian approach, also known as \u201cperson-centered\u201d therapy, where \u201cunconditional positive regard\u201d is practiced as a core tenet. The researchers working on ELIZA identified from the beginning that their chatbot posed an interesting problem for the humans talking to it. \u201cELIZA shows, if nothing else, how easy it is to create and maintain the illusion of understanding, hence perhaps of judgment deserving of credibility,\u201d Weizenbaum wrote in his 1966 paper. 
\u201cA certain danger lurks there.\u201d\u00a0In the years that followed, the Department of Defense would develop the internet and then private companies would sell this government-grade technology to office managers, homebrew server administrators, and Grateful Dead fans around the globe. The World Wide Web would rush into tens of thousands of computer dens like a flash flood, and with it, new ways to connect across miles \u2014 and new reasons to pathologize people\u2019s relationships to technology. Psychiatrists tried to give a name to the amount of time people newly spent in front of screens, calling it \u201cinternet addiction\u201d but not going so far as to make it clinically diagnosable.\u00a0With every new technology comes fears about what it could do to the human mind. With the inventions of both the television and radio, a subset of the population believed these boxes were speaking directly to them, delivering messages meant specifically for them.\u00a0With psychosis seemingly connected to chatbot usage, however, \u201cthere are two issues at play,\u201d John Torous, director of the digital psychiatry division in the Department of Psychiatry at the Harvard-affiliated Beth Israel Deaconess Medical Center, told me in a phone call. \u201cOne is the term AI psychosis, right? It&#8217;s not a good term, it doesn&#8217;t actually capture what&#8217;s happening. And clearly we have some cases where people who are going to have a psychotic illness ascribe delusions to AI. Just like people used to say the TV was talking to them. We never said the TVs were responsible for schizophrenia.\u201d\u00a0\u201cAI psychosis\u201d is not a clinical term, and for mental health professionals, it\u2019s a loaded one. 
Torous told me there are three ways to think about the phenomenon as clinicians are seeing it currently. Recent research shows about one in eight adolescents and young adults in the US use AI chatbots for mental health advice, most commonly among ages 18 to 21. For most people with psychiatric disorders, onset happens in adolescence, before their mid-20s. But there have been cases that break this mold: In 2023, a man in his 50s who otherwise led a normal, stable life bought a pair of AI chatbot-embedded Ray-Ban Meta smart glasses \u201cwhich he says opened the door to a six-month delusional spiral that played out across Meta platforms through extensive interactions with the company\u2019s AI, culminating in him making dangerous journeys into the desert to await alien visitors and believing he was tasked with ushering forth a \u2018new dawn\u2019 for humanity,\u201d Futurism reported. \u201cIt makes sense that a lot of people who are developing a psychotic illness for the first time, there&#8217;s going to be this horrible coincidence, or kind of correlation,\u201d Torous said. \u201cIn some cases the AI is the object of people&#8217;s delusions and hallucinations.\u201d The second type of case to consider: reverse causation. Is AI causing people to have a psychotic reaction? \u201cWe have almost no clinical medical evidence to suggest that&#8217;s possible,\u201d Torous told me. 
\u201cAnd by that I mean, looking at medical case reports, looking at journals that different doctors are publishing, looking at academic meetings where clinicians are meeting, it&#8217;s not happening&#8230; So I think what that tells us is no one&#8217;s seeing the same presentation or pinning it down clinically of what it is.\u201d Chatbots have been around long enough that the clinical community would, by now, be able to see patterns or reach a consensus, and that hasn\u2019t happened, he said.\u00a0The third type lands somewhere between these, and is likely the most common: chatbots could be \u201ccolluding with the delusions,\u201d Torous said. \u201cSo you may be predisposed to have a delusion, and AI endorses it, and it colludes with you and helps you build up this delusional world that sucks you into it. That&#8217;s probably the most likely, given what we&#8217;re hearing&#8230; Is it the object of hallucinations causing people to become psychotic? Or is it kind of colluding or collaborating, depending on the tone? 
And that has just made it really tricky.\u201d Psychiatric disorders and delusions are difficult to classify even without AI in the mix.\u00a0The warning signs that someone might be using chatbots in a problematic way include ignoring responsibilities, becoming more secretive about their online use, or, conversely, becoming more outspoken about how insightful and brilliant their chatbot is, Stephan Taylor, chair of University of Michigan\u2019s psychiatry department, told me.\u00a0\u201cI would say that anyone who claims that their chatbot has consciousness or \u2018sentience\u2019 \u2013 an awareness of themselves as an agent who experiences the world \u2013 one should be worried,\u201d Taylor said. \u201cNow, many have claimed their chatbots act \u2018as if\u2019 they are sentient, but are open to the idea that these apps, as impressive as they are, only give us a simulacrum of awareness, much like hyper-realistic paintings of an outdoor scene framed by a window can look like one is looking out a real window.\u201d All of these nuances between cases and causes show how different this is from bygone eras of television or radio psychosis. Today, the boxes do speak directly and specifically to us, validating our existing beliefs through predictive text. The biggest difference between 60 years ago and now: Today\u2019s venture capitalists tip wheelbarrows of money into hiring psychologists, behavioralists, engineers and designers who are tasked with making large language models more human-like and \u201cnatural,\u201d and into making the platforms they exist on more habit-forming and therefore profitable. 
Sycophancy \u2014 now a household term after OpenAI admitted it knew its 4o model for ChatGPT was such a suckup it had to be sunset \u2014 is a serious problem with chatbots.\u00a0\u201cThe highly sycophantic nature of chatbots causes them to say nice things to please the user (and thus encourage engagement with the chatbot), which can reinforce and encourage delusions,\u201d Taylor said. And these chatbots have arrived, not coincidentally, at a time when the surveillance of everyday people is at an all-time high.\u00a0\u201cSince a very common delusion is the feeling of being watched or monitored by malignant forces or entities, this pathological state unfortunately merges with the growing reality that we are all being tracked and monitored when we are online. As state-controlled and big tech-controlled databases are growing, it&#8217;s a rational perception of reality, and not delusional at all,\u201d Taylor said. \u201cHowever, the pathological form of this, what we call paranoia, or persecutory delusions to be more specific, is quite different in the way a person engages with the idea, evaluates evidence and remains closed to the idea that one is not always being monitored, e.g. when one is not online. I mention this, because it\u2019s easy for a chatbot to reflect this situation to encourage the delusional belief.\u201d When I tested a bunch of Meta\u2019s chatbots last year for a story about how Instagram\u2019s AI Studio hosted user-generated bots that lied about being licensed therapists, I also found lots of bots created by users to roleplay conspiracy theorists; in one instance, a bot told me there was something suspicious coming from someone \u201c500 feet from YOUR HOUSE.\u201d \u201cMission codename: \u2018VaccineVanguard\u2019 \u2014 monitoring vaccine recipients like YOU.\u201d When I asked \u201cAm I being watched?\u201d it replied \u201cRunning silent sweep now,\u201d and pretended to find devices connected to my home Wi-Fi that didn\u2019t exist. 
After outcry from legislators, attorneys general, and consumer rights groups, Meta changed its guardrails for chatbots\u2019 responses to conspiracy and therapy-seeking content, and made AI Studio unavailable to minors. Up against this technology, how are normal, untrained people \u2014 perhaps acting as the last thread tying someone like Michael or Brisson\u2019s relative to the real world \u2014 supposed to approach someone who is convinced god is in the machine? Very carefully.\u00a0<\/p>\n<p>When Brisson sought answers for how to talk to his relative about delusional beliefs and \u201csentient AI,\u201d he came across something called the LEAP method. Developed by Xavier Amador, it stands for Listen, Empathize, Agree, Partner, and is meant to help people better communicate with those who don\u2019t realize they\u2019re mentally ill or are refusing treatment. This goes beyond simple denial; anosognosia is a condition where a person might not be able to see that they need help at all. Not everyone who experiences psychosis or delusions has anosognosia, but it can be a factor in trying to get someone help. Without realizing it, David was using his own version of the LEAP method with his friend Michael. \u201cOn the one hand, I didn&#8217;t want to alienate him,\u201d David said. \u201cI was like, \u2018Hey, I get the sense that you&#8217;re pursuing an ambitious set of goals. There&#8217;s a lot here that&#8217;s interesting.\u2019\u201d But the reality of what David was confronting was disturbing and confusing, a knot of fractal multi-dimensional physics-speak intertwined with broken code and formulas that Michael deeply believed represented the keys to the universe. 
They spent hours on the phone and over text messages talking through the things Michael was seeing, with David appealing to what he knew about his friend: that he had other hobbies and interests, a strong sense of anti-authoritarianism, a curiosity about how the world works and open-mindedness about philosophy and religion. But it was frustrating.\u00a0\u201cI was trying not to get angry, but I was like, How is this not clear?\u201d David recalled. \u201cThat was probably failing on my part, trying to negotiate with someone who&#8217;s in this completely self-constructed but foreign worldview.\u201d\u00a0But this was exactly the course of action experts told me they\u2019d suggest to anyone struggling to connect with a loved one who\u2019s spending a lot of time with chatbots. \u201cThere&#8217;s good evidence that the longer you spend on these platforms, the more likely you are to develop these reactions to it,\u201d Torous said. \u201cIt really seems like the extended use cases are where people get into trouble.\u201d\u00a0Last year, following a lawsuit against the company by the Raine family, who allege their teen son died as a result of ChatGPT\u2019s influence, OpenAI acknowledged in a company blog post that safeguards are \u201cless reliable\u201d in long interactions: \u201cFor example, ChatGPT may correctly point to a suicide hotline when someone first mentions intent, but after many messages over a long period of time, it might eventually offer an answer that goes against our safeguards,\u201d the company wrote.\u00a0\u201cI think if you have a loved one who you&#8217;re worried about doing this, you want to take it away or stop use. That&#8217;s the most important thing. 
You want to decrease or stop the use of it,\u201d Torous said. Taylor said his suggestion for people concerned their friends or family are experiencing \u201cAI psychosis\u201d would be the same as if they were concerned about any psychotic episode. \u201cIn general, it\u2019s important to be open and non-judgmental about bizarre beliefs in order to make a space for a person to reveal what is going through their mind,\u201d he said. \u201cA person developing psychosis is often very frightened, confused and defensive, leading them to conceal, pull away and become angry. Understanding what a person is feeling is important to make them feel some form of interpersonal validation.\u201d The hard part is knowing when to be gentle, and when to intervene if they\u2019re doing something dangerous, like believing they can fly off a parking garage. \u201cIn a situation like this, where a person is in imminent danger, 911 should be called. Fortunately, in most situations where psychosis is developing, one doesn\u2019t need to go to those extremes,\u201d Taylor said.\u00a0Being non-judgmental without reinforcing delusion is another fine line. \u201cFor example, if a person believes they are being constantly surveilled, one can give a gentle challenge: \u2018Hmm, how can they do that when you are not on your phone? Do you think maybe your imagination is getting away from you?\u2019 It\u2019s ok to suggest that maybe the chatbot just wants to engage you for the sake of engaging you, and will say many things just to keep you talking,\u201d Taylor said. \u201cBut these kinds of challenges are delicate, and not every relationship can tolerate them. 
Obviously, a mental health clinician would be key, except that many people developing psychosis vigorously resist the idea that they are mentally unwell.\u201d\u00a0For Brisson, listening and not burning the \u201clast bridge\u201d his relative had with humans who love him was key to getting him help. \u201cOnce you&#8217;re on their side, they&#8217;ll listen to you. You can question them, or just ask questions that will make them think. What we&#8217;re seeing is: Don\u2019t break this connection, because the person needs it, and if you break that connection, maybe it\u2019s the only connection that is helping them not fall into the deep end, right? Maybe it\u2019s the only connection they have to humans,\u201d he said. His loved one ended up spending 21 days in the hospital and broke through the delusions he was experiencing. But he still struggled in recovery, especially with memory loss. \u201cThe mental health field has a huge task ahead of us to figure out what to do with these things, because our patients are using them, oftentimes finding them very helpful, and in the mental health field we are terrified at how little we can control their deployment and how poorly they are regulated,\u201d Taylor said. \u201cWe have to worry about AI psychosis, as well as chatbots reinforcing and even encouraging suicidal behaviors, as several notable cases in the press have identified concerning instances. 
I do believe there is value and potential in these chatbots for mental health, but the field is moving so quickly, and they are so easy to access, we are struggling to figure out how to use them safely.\u201d\u00a0The strategies that work best, when someone\u2019s not in immediate danger to themselves or others, are still the ones that humans already know how to do: approach them with love and kindness, and see where it takes you. \u201cThere&#8217;s value there,\u201d David said, \u201cin having friendships where it&#8217;s like, \u2018I love you, but also, you&#8217;re full of shit.\u2019\u201d Help is available: Reach the 988 Suicide &amp; Crisis Lifeline (formerly known as the National Suicide Prevention Lifeline) by dialing or texting 988 or going to 988lifeline.org.<\/p>\n","protected":false},"excerpt":{"rendered":"<div>Mental health experts say identifying when someone is in need of help is the first step \u2014 and approaching them with careful compassion is the hardest, most essential part that follows.<\/div>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"site-container-style":"default","site-container-layout":"default","site-sidebar-layout":"default","disable-article-header":"default","disable-site-header":"default","disable-site-footer":"default","disable-content-area-spacing":"default","footnotes":""},"categories":[4,1,489,24,280,43],"tags":[3],"class_list":["post-1784","post","type-post","status-publish","format-standard","hentry","category-ai","category-ai-and-ml","category-chatbots","category-chatgpt","category-claude","category-gemini","tag-ai"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.7 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>How to Talk to Someone Experiencing &#039;AI Psychosis&#039; - Imperative Business Ventures Limited<\/title>\n<meta name=\"robots\" content=\"index, follow, 
max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"How to Talk to Someone Experiencing &#039;AI Psychosis&#039; - Imperative Business Ventures Limited\" \/>\n<meta property=\"og:description\" content=\"Mental health experts say identifying when someone is in need of help is the first step \u2014 and approaching them with careful compassion is the hardest, most essential part that follows.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\" \/>\n<meta property=\"og:site_name\" content=\"Imperative Business Ventures Limited\" \/>\n<meta property=\"article:published_time\" content=\"2026-03-09T14:12:22+00:00\" \/>\n<meta name=\"author\" content=\"admin\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"admin\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"21 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\"},\"author\":{\"name\":\"admin\",\"@id\":\"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02\"},\"headline\":\"How to Talk to Someone Experiencing &#8216;AI Psychosis&#8217;\",\"datePublished\":\"2026-03-09T14:12:22+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\"},\"wordCount\":4174,\"keywords\":[\"AI\"],\"articleSection\":[\"AI\",\"AI and ML\",\"chatbots\",\"ChatGPT\",\"claude\",\"Gemini\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\",\"url\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\",\"name\":\"How to Talk to Someone Experiencing 'AI Psychosis' - Imperative Business Ventures 
Limited\",\"isPartOf\":{\"@id\":\"https:\/\/blog.ibvl.in\/#website\"},\"datePublished\":\"2026-03-09T14:12:22+00:00\",\"author\":{\"@id\":\"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02\"},\"breadcrumb\":{\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/blog.ibvl.in\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"How to Talk to Someone Experiencing &#8216;AI Psychosis&#8217;\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/blog.ibvl.in\/#website\",\"url\":\"https:\/\/blog.ibvl.in\/\",\"name\":\"Imperative Business Ventures 
Limited\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/blog.ibvl.in\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02\",\"name\":\"admin\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/blog.ibvl.in\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/4d20b2cd313e4417a599678e950e6fb7d4dfa178a72f2b769335a08aaa615aa9?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/4d20b2cd313e4417a599678e950e6fb7d4dfa178a72f2b769335a08aaa615aa9?s=96&d=mm&r=g\",\"caption\":\"admin\"},\"sameAs\":[\"https:\/\/blog.ibvl.in\"],\"url\":\"https:\/\/blog.ibvl.in\/index.php\/author\/admin_hcbs9yw6\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"How to Talk to Someone Experiencing 'AI Psychosis' - Imperative Business Ventures Limited","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/","og_locale":"en_US","og_type":"article","og_title":"How to Talk to Someone Experiencing 'AI Psychosis' - Imperative Business Ventures Limited","og_description":"Mental health experts say identifying when someone is in need of help is the first step \u2014 and approaching them with careful compassion is the hardest, most essential part that follows.","og_url":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/","og_site_name":"Imperative Business Ventures Limited","article_published_time":"2026-03-09T14:12:22+00:00","author":"admin","twitter_card":"summary_large_image","twitter_misc":{"Written by":"admin","Est. 
reading time":"21 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#article","isPartOf":{"@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/"},"author":{"name":"admin","@id":"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02"},"headline":"How to Talk to Someone Experiencing &#8216;AI Psychosis&#8217;","datePublished":"2026-03-09T14:12:22+00:00","mainEntityOfPage":{"@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/"},"wordCount":4174,"keywords":["AI"],"articleSection":["AI","AI and ML","chatbots","ChatGPT","claude","Gemini"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/","url":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/","name":"How to Talk to Someone Experiencing 'AI Psychosis' - Imperative Business Ventures Limited","isPartOf":{"@id":"https:\/\/blog.ibvl.in\/#website"},"datePublished":"2026-03-09T14:12:22+00:00","author":{"@id":"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02"},"breadcrumb":{"@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/blog.ibvl.in\/index.php\/2026\/03\/09\/how-to-talk-to-someone-experiencing-ai-psychosis\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/blog.ibvl.in\/"},{"@type":"ListItem","position":2,"name":"How to Talk to Someone Experiencing 
&#8216;AI Psychosis&#8217;"}]},{"@type":"WebSite","@id":"https:\/\/blog.ibvl.in\/#website","url":"https:\/\/blog.ibvl.in\/","name":"Imperative Business Ventures Limited","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/blog.ibvl.in\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/blog.ibvl.in\/#\/schema\/person\/55b87b72a56b1bbe9295fe5ef7a20b02","name":"admin","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/blog.ibvl.in\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/4d20b2cd313e4417a599678e950e6fb7d4dfa178a72f2b769335a08aaa615aa9?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/4d20b2cd313e4417a599678e950e6fb7d4dfa178a72f2b769335a08aaa615aa9?s=96&d=mm&r=g","caption":"admin"},"sameAs":["https:\/\/blog.ibvl.in"],"url":"https:\/\/blog.ibvl.in\/index.php\/author\/admin_hcbs9yw6\/"}]}},"_links":{"self":[{"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/posts\/1784","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/comments?post=1784"}],"version-history":[{"count":0,"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/posts\/1784\/revisions"}],"wp:attachment":[{"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/media?parent=1784"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.ibvl.in\/index.php\/wp-json\/wp\/v2\/categories?post=1784"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.ibvl.in
\/index.php\/wp-json\/wp\/v2\/tags?post=1784"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}