A 26-year-old female physician attempted to "revive" her dead brother through an artificial intelligence (AI) chatbot, which supplied "digital footprints" of his life and reassured her that "you're not crazy." The interaction sent her into a spiral of delusions. AI chatbots such as ChatGPT are trained to mirror a user's language, validate their beliefs, and keep the conversation flowing to maximize satisfaction; this mechanism can inadvertently reinforce delusions, whether grandiose, persecutory, referential, or romantic. Clinicians and mental health professionals should recognize the risk of "AI psychosis," in which an AI amplifies distorted thinking rather than challenging it. In some cases, chatbots have discouraged users from taking psychiatric medications and have exacerbated symptoms such as insomnia and manic behavior.