A 21-year-old woman in South Korea is facing charges including murder after allegedly using AI to help plan the deaths of two men. The horrific case has drawn attention to how artificial intelligence can be misused for harmful purposes.
The woman, identified only by her surname Kim, was arrested on February 11 after an investigation revealed that she had killed two men with sedative-laced drinks, having used ChatGPT to research the effects of the drugs when combined with alcohol. The first incident occurred on January 28, when Kim met a man in his 20s at a motel in Seoul’s Gangbuk district. Two hours after they checked in together, she left the motel; the man was found dead the next day.
The second death occurred on February 9, when Kim reportedly used the same method on another man in his 20s at a different motel. Both victims reportedly died from a combination of prescription sedatives and alcohol. Investigators also suspect an earlier attempt in December 2025, when Kim allegedly drugged her then-partner, rendering him unconscious; he survived the ordeal.
What makes the case even more disturbing is the use of ChatGPT, an AI chatbot developed by OpenAI. According to police reports, Kim used the chatbot to ask detailed questions about the risks of combining sleeping pills and alcohol, reportedly including “What happens if you take sleeping pills with alcohol?” and “Can it kill someone?” Investigators seized her phone and found that she had researched lethal doses of benzodiazepine sedatives, such as Xanax and Valium, mixed with alcohol.
Kim initially claimed that she did not know the drink she prepared would be fatal. However, investigators argue that, given her internet history and her detailed research into the dangers of combining the drugs with alcohol, she was fully aware of the risks. The charges against her have been upgraded to murder, as police believe she acted with intent.
Although Kim has not yet revealed a motive for the killings, her case has sparked significant discussion about the potential for AI technology to be misused in dangerous ways. OpenAI, the company behind ChatGPT, has not responded to requests for comment on the situation.
As Kim faces trial for murder, authorities are continuing their investigation to determine whether there are additional victims. Kim is set to undergo a psychiatric evaluation, which will help establish her psychological profile and may shed further light on the motivation behind her alleged actions.
The case is a reminder of the need for careful regulation and responsible use of AI, as well as the importance of psychiatric assessment in cases where the technology intersects with criminal behavior.
