LLMs Don’t Hallucinate- They Confabulate
When an LLM's language output gives wrong information, the appropriate term is: ✅ Confabulation (and, in extreme cases where the model is confidently wrong, delusion) ❌ Not hallucination