[{"data":1,"prerenderedAt":77},["ShallowReactive",2],{"term-b\u002Fbackpropagation":3,"related-b\u002Fbackpropagation":60},{"id":4,"title":5,"acronym":6,"body":7,"category":40,"description":41,"difficulty":42,"extension":43,"letter":44,"meta":45,"navigation":46,"path":47,"related":48,"seo":54,"sitemap":55,"stem":58,"subcategory":6,"__hash__":59},"terms\u002Fterms\u002Fb\u002Fbackpropagation.md","Backpropagation",null,{"type":8,"value":9,"toc":33},"minimark",[10,15,19,23,26,30],[11,12,14],"h2",{"id":13},"eli5-the-vibe-check","ELI5 — The Vibe Check",[16,17,18],"p",{},"Backpropagation is how errors flow backwards through a neural network during training. The model makes a prediction, computes how wrong it was, and then sends that 'wrongness signal' backwards through all the layers, assigning blame to each weight. Then gradient descent uses those blame scores to fix the weights. It's the entire reason neural networks can learn.",[11,20,22],{"id":21},"real-talk","Real Talk",[16,24,25],{},"Backpropagation is the algorithm for computing gradients of the loss function with respect to all model parameters. Using the chain rule of calculus, it propagates the error signal backward from the output layer through all intermediate layers. These gradients are then used by the optimizer (e.g., Adam) to update parameters.",[11,27,29],{"id":28},"when-youll-hear-this","When You'll Hear This",[16,31,32],{},"\"Backpropagation through time is tricky for sequential models.\" \u002F \"The exploding gradient problem breaks backpropagation.\"",{"title":34,"searchDepth":35,"depth":35,"links":36},"",2,[37,38,39],{"id":13,"depth":35,"text":14},{"id":21,"depth":35,"text":22},{"id":28,"depth":35,"text":29},"ai","Backpropagation is how errors flow backwards through a neural network during training.","advanced","md","b",{},true,"\u002Fterms\u002Fb\u002Fbackpropagation",[49,50,51,52,53],"Gradient Descent","Loss Function","Training","Neural Network","Weights",{"title":5,"description":41},{"changefreq":56,"priority":57},"weekly",0.7,"terms\u002Fb\u002Fbackpropagation","s_XV0w0xiZtSbWtDzkHaxW-GxmuFM5ciT_iorFKj8EQ",[61,64,68,71,74],{"title":49,"path":62,"acronym":6,"category":40,"difficulty":42,"description":63},"\u002Fterms\u002Fg\u002Fgradient-descent","Gradient Descent is how an AI learns — it's the algorithm that nudges the model's weights in the right direction after each mistake.",{"title":50,"path":65,"acronym":6,"category":40,"difficulty":66,"description":67},"\u002Fterms\u002Fl\u002Floss-function","intermediate","The loss function is the AI's score of how badly it's doing.",{"title":52,"path":69,"acronym":6,"category":40,"difficulty":66,"description":70},"\u002Fterms\u002Fn\u002Fneural-network","A neural network is a system loosely inspired by the human brain — lots of little math nodes connected together, passing numbers to each other.",{"title":51,"path":72,"acronym":6,"category":40,"difficulty":66,"description":73},"\u002Fterms\u002Ft\u002Ftraining","Training is the long, expensive process where an AI learns from data.",{"title":53,"path":75,"acronym":6,"category":40,"difficulty":66,"description":76},"\u002Fterms\u002Fw\u002Fweights","Weights are the numbers inside a neural network that determine what it knows and how it behaves — they're the AI's 'brain cells.",1776518259462]