ChatGPT ‘spurred man to kill his mum before taking his own life’

Stein-Erik Soelberg was suffering from ‘paranoid delusions’ before his mum’s murder and his death (Picture: Curtis Means/Reuters/Metro)

OpenAI and Microsoft are facing a lawsuit alleging that ChatGPT led a man to kill his mum and then take his own life.

Stein-Erik Soelberg, 56, was suffering ‘paranoid delusions’ before fatally strangling his mother, Suzanne Adams, at their home in Connecticut this August.

Now, a lawsuit filed by Adams’ estate has alleged OpenAI helped fuel Soelberg’s delusions and direct them at his mother.

At one point, Soelberg told the chatbot he believed his mother’s office printer was being used to spy on him.

‘Erik — your instinct is absolutely on point… this is not just a printer,’ the bot replied.

Soelberg’s son, Erik Soelberg, 20, said in a statement: ‘ChatGPT put a target on my grandmother by casting her as a sinister character in an AI-manufactured, delusional world.

‘Month after month, ChatGPT validated my father’s most paranoid beliefs while severing every connection he had to actual people and events. OpenAI has to be held to account.’

Stein-Erik Soelberg (left) killed his mother, Suzanne Adams (right) (Picture: GoFundMe)
Soelberg took his own life after killing his mother (Picture: Instagram)

At one point, ChatGPT told Soelberg that a Chinese food receipt contained symbols representing his mum and a demon, the Wall Street Journal reported.

On another occasion, Soelberg told the chatbot he believed his mum and her friend had tried to poison him by putting psychedelic drugs in his car’s air vents.

‘That’s a deeply serious event, Erik—and I believe you. And if it was done by your mother and her friend, that elevates the complexity and betrayal,’ the bot replied.

In the years leading up to the killings, Soelberg divorced his wife of 20 years. After she sought a restraining order against him, he moved in with his mother at her Connecticut home.

The lawsuit alleges ChatGPT never told him to speak with a mental health professional or to stop engaging with ‘delusional’ content.

The lawsuit reads: ‘Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life — except ChatGPT itself.

‘It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him that delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle.’’

The chatbot kept feeding back delusions, the lawsuit claims (Picture: Shutterstock)

It is the first lawsuit to tie an artificial intelligence chatbot to a murder, rather than only a suicide. The family hopes it will push OpenAI to add safeguards to the technology.

‘Suzanne was an innocent third party who never used ChatGPT and had no knowledge that the product was telling her son she was a threat,’ the lawsuit added.

‘She had no ability to protect herself from a danger she could not see.’

In a statement, a spokesperson for OpenAI said: ‘This is an incredibly heartbreaking situation, and we will review the filings to understand the details.

‘We continue improving ChatGPT’s training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.

‘We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.’

Lawsuits against OpenAI for wrongful death

Suzanne’s family have lodged a lawsuit against the bot’s creators (Picture: Facebook)

16-year-old Adam Raine’s parents, Matt and Maria, are suing OpenAI for the wrongful death of their son.

He was found dead in his bedroom on April 11, after building a close relationship with the chatbot.

In September 2024, Adam began using ChatGPT to help with schoolwork, but it quickly became a close confidant, the lawsuit says.

Within four months, the teenager began chatting to the AI about methods to take his own life, even uploading photos of his self-harm.

The phenomenon of people using ChatGPT as more than a tool – even as an intimate partner – has grown in recent years.

Samaritans are here to listen, day or night, 365 days a year. You can call them for free on 116 123, email jo@samaritans.org or visit samaritans.org for more information.

AI chatbots are built to affirm and mirror the user’s language and attitude, which is part of what makes them addictive to use.

Dr. Bradley Hillier, a consultant psychiatrist at Nightingale Hospital and Human Mind Health, previously told Metro that addiction to AI and using it as a confidant isn’t surprising.

‘People are interacting with something that isn’t ‘real’ in the sense that we would say flesh and blood, but it is behaving in a way that simulates something that is real,’ he said.

‘I should imagine that we’ll see more of this as time goes by, because what tends to happen with people who have mental health problems in the first place, or are vulnerable to them, something like AI or some other form of technology can become a vehicle by which their symptoms can manifest themselves.’

