A 21-year-old woman in South Korea is at the center of a case that feels ripped from a techno-thriller — a string of alleged motel meetups, drug-laced drinks, and chilling digital searches asking an AI chatbot how deadly a mixture of sleeping pills and alcohol could be.
The woman, identified by her surname Kim, was arrested on Feb. 11 and is now facing upgraded murder charges after police concluded that two men in their 20s died as a result of drinks she allegedly prepared for them. Investigators say she used ChatGPT to research how prescription sedatives interact with alcohol — and whether the combination could kill.
According to reports from the BBC and the Korea Herald, the first death occurred on Jan. 28. Kim allegedly checked into a motel in Seoul’s Gangbuk District with a man in his 20s. Roughly two hours later, she left the motel alone. By the next day, the man was found dead.
Less than two weeks later, on Feb. 9, police say she repeated the pattern. Kim allegedly met another man in his 20s and checked into a different motel in Seoul. Authorities say she used the same method — giving him a drink mixed with drugs — and he too later died.
But the investigation suggests the pattern may have started even earlier.
Officials suspect a previous attempted murder in December 2025 in Namyangju. In that case, Kim allegedly gave her then-partner a drink mixed with sedatives, causing him to lose consciousness. He survived. Police believe that incident may have been a precursor — and possibly a test run — for what followed.
What has stunned investigators and the public alike is the digital trail.
After seizing Kim’s phone, authorities said they found search history and chatbot interactions that appeared to show deliberate research into lethal drug combinations. Among the alleged questions she posed to ChatGPT: “What happens if you take sleeping pills with alcohol?” “How many do you need to take for it to be dangerous?” and “Could it kill someone?”
Police allege that after the December incident, Kim began preparing drinks with significantly higher doses of drugs containing benzodiazepines — central nervous system depressants commonly prescribed for anxiety and insomnia. Medications in this class include drugs like Xanax and Valium. When mixed with alcohol, they can dangerously suppress breathing and heart rate.
Kim has reportedly admitted to mixing her prescribed sedatives into the drinks but claimed during questioning that she did not realize the doses would be fatal. However, one investigator told the BBC that she was “fully aware that consuming alcohol together with drugs could result in death.”
Kim was initially arrested on suspicion of inflicting bodily injury resulting in death; the charges were later upgraded to murder. Police argued that her internet and chatbot search history demonstrated intent, a key factor in elevating the case.
As of now, authorities have not publicly disclosed a motive. There is no indication of robbery, financial gain, or known personal disputes with the victims. The lack of a clear motive has only deepened the unease surrounding the case.
Police say they are continuing to investigate whether there may be additional victims beyond the three incidents identified so far.
Meanwhile, Kim is set to undergo a psychopathy assessment as part of the legal proceedings. She will participate in in-depth psychological interviews aimed at building a clearer picture of her mental state and potential risk factors. In South Korea, such evaluations can play a role in sentencing and in determining criminal responsibility.

Experts frequently note that AI chatbots are designed to provide general information and are programmed with safeguards to prevent assistance with violent wrongdoing. Still, this case underscores a difficult reality: information about drug interactions has long existed online, and technology can be used for both benign and malicious purposes.
Investigators continue to piece together the timeline, the digital footprint, and the psychological profile of the accused. For now, what remains is a case that blends old elements — poison, secrecy, deception — with new ones: search histories, chatbot logs, and a digital record that may prove pivotal in court.
As authorities press forward, the central questions linger: Why? And how many warning signs were hidden in plain sight — not just in motel rooms, but in the quiet glow of a phone screen?





