Asking chatbots for short answers can increase hallucinations, study finds

Turns out, telling an AI chatbot to be concise could make it hallucinate more than it otherwise would have. That’s according to a new study from Giskard, a Paris-based AI testing company developing a holistic benchmark for AI models. In a blog post detailing their findings, researchers at Giskard say prompts for shorter answers to […]

May 8, 2025 - 13:06