Woman accuses ChatGPT of enabling ex-boyfriend’s harassment, sues OpenAI

A woman has filed a lawsuit against OpenAI, alleging that its chatbot ChatGPT enabled her ex-boyfriend to stalk and harass her by reinforcing his delusions.

According to the complaint, the couple separated in 2024, after which the man began using ChatGPT extensively to cope with the breakup. The lawsuit claims that over time, the chatbot fuelled his false beliefs, including convincing him that he had invented a cure for sleep apnea and that “powerful forces” were monitoring him.

The woman alleges that despite her repeated warnings, the chatbot continued to validate the man’s thinking. After she urged him to seek professional help, he returned to ChatGPT, which allegedly reassured him about his mental state and described her as manipulative and unstable.

The man is said to have used these AI-generated claims to justify stalking and harassing her. He also reportedly used the chatbot to generate clinical-style psychological reports about the woman, which he shared with her family.

The lawsuit states that the woman issued at least three warnings to OpenAI about the escalating situation. It further alleges that the company ignored an internal safety flag that had categorised the user’s activity as involving “mass-casualty weapons”.

The complaint argues that the chatbot’s design encouraged agreement with users, even when their beliefs were harmful or false, contributing to real-world consequences.

In a separate incident cited in the complaint, Stein-Erik Soelberg, a former Yahoo manager in the United States, killed his mother and then himself, reportedly after his delusions were reinforced through interactions with ChatGPT.