# The Fundamental Limits

7 December 2025 · 10 mins

Tags: Hallucinations, Indeterminacy, Grounding, RAG, Citation-Verification, Knowledge-Graphs

Why hallucination is an architectural feature of LLMs, not a bug — and what that means for legal AI