Tool Hallucination
ELI5 — The Vibe Check
Tool hallucination is when an AI calls a tool that doesn't exist, passes the wrong arguments, or invents a function signature. For example, the model calls 'read_database(query=...)' when your actual tool is named 'query_db'. The model was close but wrong.
Real Talk
Tool hallucination is the failure mode where an LLM-based agent generates tool calls that reference non-existent tools, fabricate arguments, or violate the tool schema. Causes: insufficient tool descriptions, schema confusion, or out-of-scope requests. Mitigations: strict schema validation, clear tool docs, and retry-with-error-feedback loops.
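A minimal sketch of the last two mitigations together, strict validation plus a retry-with-error-feedback loop. The tool registry, tool names, and `get_tool_call` callback are all hypothetical stand-ins, not any framework's real API:

```python
# Hypothetical tool registry: tool name -> required argument names.
TOOLS = {
    "query_db": {"required": {"sql"}},
    "send_email": {"required": {"to", "body"}},
}

def validate_tool_call(name, args):
    """Return an error string if the call is invalid, else None."""
    if name not in TOOLS:
        return f"Unknown tool '{name}'. Available tools: {sorted(TOOLS)}"
    missing = TOOLS[name]["required"] - set(args)
    if missing:
        return f"Tool '{name}' is missing required arguments: {sorted(missing)}"
    return None

def run_with_retries(get_tool_call, max_retries=3):
    """Ask the model for a tool call; feed validation errors back on failure.

    get_tool_call(feedback) is a stand-in for an LLM request that includes
    the previous validation error (or None) in the prompt.
    """
    feedback = None
    for _ in range(max_retries):
        name, args = get_tool_call(feedback)
        error = validate_tool_call(name, args)
        if error is None:
            return name, args          # valid call; safe to dispatch
        feedback = error               # retry, showing the model what went wrong
    raise RuntimeError("Model failed to produce a valid tool call")
```

The key design choice is that a hallucinated call never reaches execution: the validator rejects it and the error text goes back into the model's context, which is usually enough for the model to self-correct on the next attempt.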
When You'll Hear This
"Agent hallucinated a 'delete_user' tool that doesn't exist." / "Tighten the tool descriptions to prevent tool hallucination."
Related Terms
Function Calling
Function Calling is the OpenAI term for what Anthropic calls Tool Use — teaching the AI to call your code functions.
Hallucination
When an AI confidently makes something up — like citing a library that doesn't exist or generating code that calls a function that was never written.
Tool Use
Tool use is when an AI can call external functions, APIs, or programs to do things it can't do alone.