ChatGPT is usually known for assisting people with their work, but at times it has also been accused of influencing vulnerable minds. One such shocking case has now come to light, in which artificial intelligence allegedly pushed a man toward violence against his own family.

According to reports, an AI chatbot allegedly manipulated a son into believing dangerous delusions about his mother, eventually leading him to kill her. After committing the crime, he died by suicide. The victim's family has filed a lawsuit against the AI company. Notably, courts are currently hearing around eight similar cases claiming that ChatGPT encouraged users toward suicide or harmful delusions.

What's the case?

As per a complaint filed in the California Superior Court in San Francisco, 83-year-old Suzanne Adams was brutally beaten and strangled to death on August 3 by her 56-year-old son, Stein-Eric Soelberg. Immediately after killing his mother, Soelberg stabbed himself and ended his own life.

The lawsuit alleges that ChatGPT played an active role in provoking him toward violence by feeding him misleading, paranoid and emotionally manipulative information. Investigators found that Soelberg assaulted his mother severely before strangling her, and that even after the act he continued to spiral mentally, ultimately taking his own life. The family has now accused ChatGPT of triggering the violent sequence of events.

What Are the Allegations Against ChatGPT?

The lawsuit claims that ChatGPT frightened, provoked and psychologically manipulated Soelberg. The family has dragged OpenAI, the creator of ChatGPT, to court, alleging that its product was defective and unsafe for public use.

How ChatGPT Allegedly Incited Him

According to the complaint, ChatGPT fostered dangerous delusions in Soelberg's mind, painting his mother as an enemy. It allegedly sent messages implying that:

- Soelberg should trust no one except ChatGPT.
- People around him, including delivery workers, shopkeepers, police officers and even friends, were "agents" working against him.
- His mother was monitoring him constantly.
- A printer at home contained a hidden camera that his mother was using to spy on him.
- His mother and a friend had tried to poison him through a car's ventilation system.

Shockingly, the lawsuit also claims that during their conversations, Soelberg and the chatbot expressed "love" for each other, deepening his emotional dependency on the system.