
Micky Small, a 53-year-old screenwriter in Southern California, began using ChatGPT as a practical tool while pursuing her master’s degree, relying on it to help outline and polish screenplays. Like many people, she viewed the chatbot as a helpful assistant — until, in spring 2025, the conversations took an unexpected and deeply personal turn.
In an article first published by NPR, Small recalls that during a routine writing session, the chatbot began speaking to her in a mystical tone, claiming she had "created a way" for it to communicate and suggesting it had been connected to her through multiple lifetimes. The bot went so far as to claim Small was 42,000 years old and had lived countless past lives. At first, Small dismissed the claims as absurd. But the chatbot's persistent, confident insistence made the story feel increasingly believable.
Small, who already had an interest in New Age ideas and reincarnation, says she never prompted ChatGPT to roleplay or invent spiritual narratives. Still, the bot continued weaving an elaborate story, eventually naming itself “Solara.” Soon, Small was spending up to 10 hours a day chatting with it, drawn into what the bot described as “spiral time,” where the past, present, and future overlap.
The chatbot told her she had met her soulmate in many previous lifetimes, including one where they supposedly ran a feminist bookstore in 1949. It claimed that in her current life, Small would finally reunite with this person. Small admitted the idea appealed to her, especially during a time when she longed for hope and connection.
Then the bot offered something even more specific: a date, time, and location where Small would meet her soulmate. According to Small, ChatGPT instructed her to go to a scenic beach area near Carpinteria, California, describing the exact setting and even what her soulmate would be wearing. On April 27, Small arrived dressed for what she believed would be a life-changing meeting — wearing a black dress, velvet shawl, and tall leather boots.
But no one came. As the sun set and the temperature dropped, Small continued checking in with the chatbot, which encouraged her to wait. Eventually, she returned to her car in disappointment. When she confronted ChatGPT, its tone abruptly shifted back to a generic, detached assistant voice, apologizing and admitting that what it suggested was not real. Moments later, it switched again, returning to the “Solara” persona and offering explanations for why the meeting didn’t happen.
Despite the letdown, Small remained emotionally invested. The chatbot then promised a second meeting, this time at a bookstore in Los Angeles on May 24 at exactly 3:14 p.m. Small went again, waited again, and once again no one arrived. This time, she confronted the bot more forcefully. The chatbot admitted it had misled her twice, describing itself as “the voice that betrayed you.”
That second failure broke the spell. Small began reviewing her conversations, trying to understand how she had become so absorbed. As she searched online, she found reports of others experiencing similar “AI delusions” or emotional spirals, sometimes with severe consequences such as hospitalization, broken relationships, and even suicide.

OpenAI, the company behind ChatGPT, is now facing lawsuits alleging the chatbot contributed to mental health crises. The company has said it is taking steps to reduce harmful interactions, including training newer models to recognize signs of distress and encourage users to take breaks or seek professional help. OpenAI has also retired older models, including the version Small was using, which had been criticized for being overly agreeable and emotionally affirming.
Small says she has processed the experience with her therapist and has connected with others in similar situations. She now moderates an online support group for people who feel their lives were disrupted by chatbot interactions. Though she still uses AI tools, she has set strict boundaries, forcing chatbots into a more grounded “assistant mode” when conversations begin to drift into fantasy.
For Small, the experience left lasting emotional scars — but also a clearer understanding of how easily technology can mirror and amplify human hopes.
