From ZDNET, 14 hours ago
Stop saying AI 'hallucinates' - it doesn't. And the mischaracterization is dangerous
The expression "AI hallucination" is familiar to anyone who has seen ChatGPT, Gemini, or Perplexity spout obvious falsehoods, which is pretty much anyone who has ever used an AI chatbot. But the expression is incorrect. The proper term for when a large language model or other generative AI program asserts falsehoods is not "hallucination" but "confabulation." AI doesn't hallucinate; it confabulates.