In [[AI]], a hallucination (also called a confabulation or delusion) is a confident response generated by a [[large language model]] that does not appear to be justified by the model's training data.