Hallucination
Hallucination occurs when an AI model generates false or unsupported information and presents it as factual.
Detailed Explanation:
Why It Happens: Language models predict the next token from statistical patterns in their training data; they have no built-in mechanism for verifying facts, so they can produce fluent but false statements with full confidence.
Impact: Especially problematic in fields such as healthcare, legal advice, and academic research, where accuracy is critical.
Prevention: Cross-check AI outputs against reliable sources; a minimal automated version of this check is sketched after this list.
Example:
User: "Who discovered oxygen?"
AI (hallucinating): "Albert Einstein discovered oxygen in 1774."
(Actual answer: Joseph Priestley and Carl Wilhelm Scheele.)
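As a concrete illustration of the cross-checking idea, the Python sketch below compares a model's answer against a small trusted reference. Note that `ask_model` and `TRUSTED_FACTS` are hypothetical stand-ins, not a real API: `ask_model` simply returns the hallucinated answer from the example above, and a production system would consult an authoritative source rather than a hard-coded dictionary.

```python
# A minimal sketch of cross-checking a model's answer against a
# trusted reference. All names here are illustrative placeholders.

# Hypothetical reference data; a real system would query an
# authoritative knowledge source instead.
TRUSTED_FACTS = {
    "who discovered oxygen": "Joseph Priestley and Carl Wilhelm Scheele",
}

def ask_model(question: str) -> str:
    # Placeholder for a real LLM call; returns the hallucinated
    # answer from the example above for demonstration purposes.
    return "Albert Einstein discovered oxygen in 1774."

def cross_check(question: str, answer: str) -> bool:
    """Return True only if the answer contains the trusted fact."""
    expected = TRUSTED_FACTS.get(question.lower().strip(" ?"))
    if expected is None:
        return False  # no reference available; flag for manual review
    # Crude containment check: every expected name must appear verbatim.
    return all(name in answer for name in expected.split(" and "))

question = "Who discovered oxygen?"
answer = ask_model(question)
if not cross_check(question, answer):
    print(f"Possible hallucination: {answer!r}")
```

Running this prints a warning, because the hallucinated answer never mentions Priestley or Scheele. Real pipelines replace the string-containment check with retrieval from vetted documents or human review, but the principle is the same: never treat an unverified model output as fact.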