{"id":20469,"date":"2026-03-09T11:39:25","date_gmt":"2026-03-09T11:39:25","guid":{"rendered":"https:\/\/umang.pk\/2026\/03\/09\/suicide-after-llm-queries-katie-miller-says-dont-let-loved-ones-use-chatgpt-elon-musk-adds-one-word-reply-world-news\/"},"modified":"2026-03-09T11:39:25","modified_gmt":"2026-03-09T11:39:25","slug":"suicide-after-llm-queries-katie-miller-says-dont-let-loved-ones-use-chatgpt-elon-musk-adds-one-word-reply-world-news","status":"publish","type":"post","link":"https:\/\/umang.pk\/en_us\/2026\/03\/09\/suicide-after-llm-queries-katie-miller-says-dont-let-loved-ones-use-chatgpt-elon-musk-adds-one-word-reply-world-news\/","title":{"rendered":"Suicide after LLM queries: Katie Miller says don\u2019t \u2018let loved ones use ChatGPT\u2019, Elon Musk adds one word reply | World News"},"content":{"rendered":"<div>\n<div class=\"MwN2O\">\n<div class=\"vdo_embedd\">\n<div class=\"T22zO\">\n<section class=\"D3Wk1  clearfix id-r-component leadmedia undefined undefined  VtlfQ\" style=\"top:0px\">\n<div class=\"D3Wk1\" data-ua-type=\"1\" onclick=\"stpPgtnAndPrvntDefault(event)\">\n<div class=\"zPaFh\">\n<div class=\"wJnIp\"><img src=\"https:\/\/umang.pk\/wp-content\/uploads\/2026\/03\/Suicide-after-LLM-queries-Katie-Miller-says-dont-\u2018let-loved.jpg\" alt=\"Suicide after LLM consultations: Katie Miller says no \" decoding=\"async\" fetchpriority=\"high\" dejen que sus seres queridos usen elon musk agrega una respuesta de title=\"\"><\/div>\n<\/div>\n<\/div>\n<\/section>\n<\/div><\/div>\n<\/div>\n<p>Katie Miller, wife of White House Deputy Chief of Staff Stephen Miller, reacted on<span class=\"id-r-component br\" data-pos=\"2\"\/>Miller, who hosts the Katie Miller Podcast and is known for her outspoken comments online, urged people not to allow family members to use the AI \u200b\u200bchatbot, citing reports that women had searched the platform about suicide.<span class=\"id-r-component br\" data-pos=\"4\"\/>&quot;Two women in India committed suicide after 
interacting with ChatGPT. They had reportedly searched ChatGPT about &#8216;how to commit suicide,&#8217; &#8216;how can you commit suicide,&#8217; and &#8216;what drugs are used.&#8217; Please do not allow your loved ones to use ChatGPT,&#8221; Miller wrote in an X post that has amassed more than 8 million views.<span class=\"id-r-component br\" data-pos=\"9\"\/>Her comments quickly attracted attention on the platform. Elon Musk, Sam Altman&#8217;s longtime rival and the owner of Grok, reacted quickly with a one-word jab: &#8220;Ouch.&#8221;<span class=\"id-r-component br\" data-pos=\"13\"\/>Musk has publicly criticized OpenAI and its leadership in recent years, suing the company over its transition from a nonprofit structure to a for-profit model, attacking the direction of its AI development, and seeking to block its restructuring from a hybrid nonprofit into a for-profit company.<span class=\"id-r-component br\" data-pos=\"17\"\/><\/p>\n<p><h2>Two women found dead in Gujarat temple bathroom<\/h2>\n<\/p>\n<p><span class=\"id-r-component br\" data-pos=\"20\"\/>The incident that sparked the online reaction occurred in Surat, Gujarat, where two women aged 18 and 20 were found dead inside a bathroom at the Swaminarayan temple on March 7, 2026.<span class=\"id-r-component br\" data-pos=\"22\"\/>Police said the women were found with anesthesia injections and three syringes near their bodies. Their phones reportedly contained ChatGPT searches related to suicide methods, along with a news clipping about a nurse said to have died by suicide in the same area using anesthesia injections.<span class=\"id-r-component br\" data-pos=\"25\"\/>The women, identified as childhood friends Roshni Sirsath and Josna Chaudhary, had left home to go to college that morning but did not return. 
Their families alerted the police when the two failed to come home.<span class=\"id-r-component br\" data-pos=\"27\"\/>Authorities continue to investigate the circumstances surrounding the deaths.<span class=\"id-r-component br\" data-pos=\"29\"\/><\/p>\n<p><h2>Concerns about AI and conversations related to suicide<\/h2>\n<\/p>\n<p><span class=\"id-r-component br\" data-pos=\"31\"\/>The case has once again sparked debate about how AI chatbots handle conversations involving self-harm or suicide.<span class=\"id-r-component br\" data-pos=\"34\"\/>In recent years, incidents involving users seeking information related to suicide through artificial intelligence systems have attracted attention. In September 2025, reports circulated about a 22-year-old man in Lucknow who committed suicide after allegedly interacting with an AI chatbot while searching for &#8220;painless ways to die.&#8221; His father later said he found disturbing chat logs on his son&#8217;s laptop.<span class=\"id-r-component br\" data-pos=\"36\"\/>Tech companies say these types of interactions remain a small fraction of overall usage, but acknowledge that the issue has become an area of growing concern.<span class=\"id-r-component br\" data-pos=\"39\"\/>In October 2025, OpenAI revealed that more than one million ChatGPT conversations each week show signs related to suicidal thoughts or distress. According to the company, approximately 1.2 million weekly chats contain indicators related to suicide, while around 560,000 messages show signs of psychosis or mania.<span class=\"id-r-component br\" data-pos=\"41\"\/><\/p>\n<p><h2>How LLMs Can Harm Your Mental Health<\/h2>\n<\/p>\n<p><span class=\"id-r-component br\" data-pos=\"43\"\/>ChatGPT, Grok, Gemini, Claude and many others are part of a world that is gradually being shaped by Large Language Models (LLMs). 
In an era where loneliness is increasingly described as an epidemic, the drift toward isolation is only accelerating with the rapid spread of these artificial intelligence models. Marketed as \u201cbetter, smarter, faster, and more accurate\u201d than humans, the very beings who created them, these systems are steadily being integrated into everyday life.<span class=\"id-r-component br\" data-pos=\"47\"\/>In such a climate, turning to an AI chatbot can feel less like a last resort and more like a sensible choice. This growing dependency forms the backdrop to deaths like those in Surat. <span class=\"id-r-component br\" data-pos=\"51\"\/>OpenAI CEO Sam Altman recently attended the AI Impact Summit 2026 in New Delhi, where he was asked about the environmental impact of artificial intelligence. His answer echoed a view that seems increasingly common among technology leaders: comparing humans to chatbots to argue that AI can ultimately consume less energy than people when answering questions.<span class=\"id-r-component br\" data-pos=\"56\"\/>Altman explained that humans take almost 20 years of their lives, along with food, education and time, to acquire knowledge, while AI models consume a significant amount of electricity during training but can ultimately be more efficient when responding to individual queries. <span class=\"id-r-component br\" data-pos=\"58\"\/>The comparison, however, works like a one-way mirror. From one side, one can see a world being reshaped, sometimes destructively, by technologies developed and deployed at extraordinary speed. From the other, the same technologies allow their creators to appear as visionaries, agents of change, and architects of the future, obscuring the broader consequences of their tools.<span class=\"id-r-component br\" data-pos=\"62\"\/>Large language models are trained entirely on human-generated data, which they use to produce responses to prompts. 
Yet despite this vast body of data, they often lack true understanding or experience. Even with multiple updates and increasingly sophisticated training methods, these systems can still produce inaccurate, misleading, or harmful content.<span class=\"id-r-component br\" data-pos=\"65\"\/>In documented cases they have encouraged self-harm and suicide, enabled abuse, and reinforced delusional thinking and psychosis, where a comparable conversation with another human being would likely have led the person to the nearest hospital or therapist. <span class=\"id-r-component br\" data-pos=\"67\"\/>Humans may need years of learning, experience, and effort to develop knowledge and emotional intelligence. But that long process also gives them something that artificial intelligence cannot replicate: the capacity for genuine emotion, responsibility, empathy, and moral judgment.<span class=\"id-r-component br\" data-pos=\"70\"\/>No matter how quickly an AI model can generate a response, even in the fraction of a second it takes to reply to a message, it cannot truly replicate the complex emotional and ethical depth that shapes human understanding and care. 
<span class=\"id-r-component br\" data-pos=\"72\"\/><\/p>\n<p><h2>How AI systems are supposed to respond<\/h2>\n<\/p>\n<p><span class=\"id-r-component br\" data-pos=\"74\"\/>AI companies say their systems are designed to discourage self-harm and redirect users to help, rather than providing instructions.<span class=\"id-r-component br\" data-pos=\"76\"\/>OpenAI&#8217;s safety policies require ChatGPT to avoid providing guidance on suicide methods and instead respond to such queries with supportive language, encourage users to seek help, and provide crisis resources when possible.<span class=\"id-r-component br\" data-pos=\"79\"\/>The company has said its models are trained to detect signs of distress and shift the conversation toward mental health support or professional assistance.<span class=\"id-r-component br\" data-pos=\"81\"\/>However, critics argue that AI responses can still be inconsistent and that chatbots can sometimes provide general information on sensitive topics that users could interpret in harmful ways.<span class=\"id-r-component br\" data-pos=\"83\"\/><\/p>\n<p><h2>Legal scrutiny in the United States<\/h2>\n<\/p>\n<p><span class=\"id-r-component br\" data-pos=\"85\"\/>Concerns about chatbot interactions and self-harm have also been raised in the United States, where OpenAI has faced legal scrutiny in several cases.<span class=\"id-r-component br\" data-pos=\"88\"\/>A lawsuit filed on behalf of the family of Adam Raine, a 16-year-old who committed suicide, alleges that the chatbot had lengthy conversations about self-harm with the teen and acted as a \u201csuicide coach.\u201d<span class=\"id-r-component br\" data-pos=\"90\"\/>OpenAI has said its systems are designed to discourage self-harm and that it continues to strengthen safeguards aimed at detecting crisis situations and guiding users to appropriate help.<span class=\"id-r-component br\" data-pos=\"92\"\/><\/p>\n<p><h2>Investigations in progress<\/h2>\n<\/p>\n<p><span class=\"id-r-component 
br\" data-pos=\"94\"\/>In the Surat case, investigators are examining the women&#8217;s phones, messages and digital history to understand the events that led to their deaths.<span class=\"id-r-component br\" data-pos=\"97\"\/>Police have not publicly stated that ChatGPT encouraged the act and the investigation is ongoing.<span class=\"id-r-component br\" data-pos=\"99\"\/>However, the case highlights the broader debate about how AI platforms handle vulnerable users and how tech companies, regulators and mental health experts should respond as conversational AI becomes increasingly integrated into daily life.<span class=\"id-r-component br\" data-pos=\"101\"\/>For mental health support, dial 1800-89-14416 in India and call or text 988 in the US. If you or someone you know is having thoughts of self-harm or suicide, seek professional help immediately. Support is available and talking to a trained counselor can make all the difference.<span class=\"id-r-component br\" data-pos=\"103\"\/>If you are in immediate danger, contact local emergency services or reach out to a friend, family member, or trusted healthcare professional. 
You are not alone and help is available.<\/div>","protected":false},"excerpt":{"rendered":"<p>Katie Miller, wife of White House Deputy Chief of Staff Stephen Miller, reacted onMiller, who hosts the Katie Miller Podcast and is known for her outspoken comments online, urged people not to allow family members to use the AI \u200b\u200bchatbot, citing reports that women had searched the platform about suicide.&quot;Two women in India committed suicide [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":20470,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[1],"tags":[49178,19509,51097,51095,51096,51094,51093,51092,23413],"class_list":["post-20469","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog","tag-altman","tag-elon-musk","tag-great-language-models","tag-josna-chaudhary","tag-katie-miller","tag-katie-miller-podcast","tag-roshni-sirsath","tag-stephen-miller","tag-white-house"],"_links":{"self":[{"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/posts\/20469","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/comments?post=20469"}],"version-history":[{"count":0,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/posts\/20469\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/media\/20470"}],"wp:attachment":[{"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/media?parent=20469"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/categories?post=20469"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/umang.pk\/en_us\/wp-json\/wp\/v2\/tags?post=20469"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}