A hallucination is a false belief about the world caused by faulty perception. An LLM has neither senses nor beliefs, so calling its errors "hallucinations" is a misnomer.
Concept::
If not “hallucinate,” then what? If we wanted to stick with the parlance of psychiatric medicine, “confabulation” would be a more apt term. A confabulation occurs when a person unknowingly produces a false recollection, as a way of backfilling a gap in their memory. Used to describe the falsehoods of large language models, this term marches us closer to what actually is going wrong: It’s not that the model is suffering errors of perception; it’s attempting to paper over the gaps in a corpus of training data that can’t possibly span every scenario it might encounter.
Source:: ChatGPT Isn’t ‘Hallucinating.’ It’s Bullshitting.