[{"data":1,"prerenderedAt":70},["ShallowReactive",2],{"term-t\u002Ftool-hallucination":3,"related-t\u002Ftool-hallucination":59},{"id":4,"title":5,"acronym":6,"body":7,"category":40,"description":41,"difficulty":42,"extension":43,"letter":44,"meta":45,"navigation":46,"path":47,"related":48,"seo":53,"sitemap":54,"stem":57,"subcategory":6,"__hash__":58},"terms\u002Fterms\u002Ft\u002Ftool-hallucination.md","Tool Hallucination",null,{"type":8,"value":9,"toc":33},"minimark",[10,15,19,23,26,30],[11,12,14],"h2",{"id":13},"eli5-the-vibe-check","ELI5 — The Vibe Check",[16,17,18],"p",{},"Tool hallucination is when an AI calls a tool that doesn't exist, uses the wrong arguments, or invents a function signature. It might call 'read_database(query=...)' when your tool is actually named 'query_db'. The model was close, but wrong.",[11,20,22],{"id":21},"real-talk","Real Talk",[16,24,25],{},"Tool hallucination is the failure mode where an LLM-based agent generates tool calls that reference non-existent tools, fabricate arguments, or violate tool schemas. Common causes: insufficient tool descriptions, schema confusion, and out-of-scope requests. Mitigations: strict schema validation, clear tool docs, and retry-with-error-feedback loops.",[11,27,29],{"id":28},"when-youll-hear-this","When You'll Hear This",[16,31,32],{},"\"Agent hallucinated a 'delete_user' tool that doesn't exist.\" \u002F \"Tighten the tool descriptions to prevent tool hallucination.\"",{"title":34,"searchDepth":35,"depth":35,"links":36},"",2,[37,38,39],{"id":13,"depth":35,"text":14},{"id":21,"depth":35,"text":22},{"id":28,"depth":35,"text":29},"ai","Tool hallucination is when an AI calls a tool that doesn't exist, uses the wrong arguments, or invents a function signature. It might call 'read_database(query=...","intermediate","md","t",{},true,"\u002Fterms\u002Ft\u002Ftool-hallucination",[49,50,51,52],"Tool Use","Function Calling","Agent Debugging","Hallucination",{"title":5,"description":41},{"changefreq":55,"priority":56},"weekly",0.7,"terms\u002Ft\u002Ftool-hallucination","xuyiEMruYJQSV66XUP7yeBdh7O3d1vraSu7fRiJLRZg",[60,63,67],{"title":50,"path":61,"acronym":6,"category":40,"difficulty":42,"description":62},"\u002Fterms\u002Ff\u002Ffunction-calling","Function Calling is the OpenAI term for what Anthropic calls Tool Use — teaching the AI to call your code functions.",{"title":52,"path":64,"acronym":6,"category":65,"difficulty":42,"description":66},"\u002Fterms\u002Fh\u002Fhallucination","vibecoding","When an AI confidently makes something up — like citing a library that doesn't exist or generating code that calls a function that was never written.",{"title":49,"path":68,"acronym":6,"category":40,"difficulty":42,"description":69},"\u002Fterms\u002Ft\u002Ftool-use","Tool use is when an AI can call external functions, APIs, or programs to do things it can't do alone.",1776518319484]