ChatGPT accused of saying an innocent man murdered his children
A privacy complaint has been filed against OpenAI by a Norwegian man who claims that ChatGPT described him as a convicted murderer who killed two of his own children and attempted to kill a third.
Arve Hjalmar Holmen says that he wanted to find out what ChatGPT would say about him, but was presented with the false claim that he had been convicted of both murder and attempted murder and was serving 21 years in a Norwegian prison. Alarmingly, the ChatGPT output mixes fictitious details with facts, including his hometown and the number and gender of his children.
Austrian advocacy group Noyb filed a complaint with Datatilsynet, Norway's data protection authority, on behalf of Holmen, accusing OpenAI of violating the data privacy requirements of the European Union's General Data Protection Regulation (GDPR). It's asking for the company to be fined and ordered to remove the defamatory output and improve its model to avoid similar errors.
“The GDPR is clear. Personal data has to be accurate. And if it’s not, users have the right to have it changed to reflect the truth,” says Joakim Söderberg, data protection lawyer at Noyb. “Showing ChatGPT users a tiny disclaimer that the chatbot can make mistakes clearly isn’t enough. You can’t just spread false information and in the end add a small disclaimer saying that everything you said may just not be true.”
Noyb and Holmen have not publicly revealed when the initial ChatGPT query was made — the detail is included in the official complaint, but redacted for its public release — but say that it was before ChatGPT was updated to include web searches in its results. Enter the same query now, and the results all relate to Noyb's complaint instead.
This is Noyb's second official complaint regarding ChatGPT, though the first had lower stakes: in April 2024 it filed on behalf of a public figure whose date of birth was being inaccurately reported by the AI tool. At the time, it took issue with OpenAI's claim that erroneous data could not be corrected, only blocked in relation to specific queries — a position Noyb says violates the GDPR's requirement for inaccurate data to be "erased or rectified without delay."