Podcast: Data Skeptic
Episode: [MINI] Backpropagation
Description: Backpropagation is a common algorithm for training a neural network. It works by computing the gradient of the overall error with respect to each weight, then using stochastic gradient descent to iteratively fine-tune the weights of the network. In this episode, we compare this concept to finding a location on a map, marble maze games, and golf.
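The description above can be sketched in code. The following is a minimal illustrative example (not from the episode): a tiny one-hidden-unit network whose two weights and biases are fine-tuned by stochastic gradient descent, with each weight's gradient computed via the chain rule (backpropagation). The toy task of learning a threshold function is an assumption made here purely for demonstration.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy task: learn y = 1 if x > 0.5 else 0 from samples.
random.seed(0)
data = [(x, 1.0 if x > 0.5 else 0.0) for x in (random.random() for _ in range(200))]

w1, b1, w2, b2 = 0.5, 0.0, 0.5, 0.0   # weights to be fine-tuned
lr = 0.5                               # learning rate

def mean_squared_error(dataset):
    return sum((sigmoid(w2 * sigmoid(w1 * x + b1) + b2) - y) ** 2
               for x, y in dataset) / len(dataset)

before = mean_squared_error(data)
for _ in range(50):                    # epochs
    for x, y in data:                  # one SGD step per example
        h = sigmoid(w1 * x + b1)       # forward pass: hidden unit
        o = sigmoid(w2 * h + b2)       # forward pass: output
        # backward pass: chain rule gives the error gradient
        # for each weight, working from the output back to the input
        d_o = 2 * (o - y) * o * (1 - o)
        d_h = d_o * w2 * h * (1 - h)
        w2 -= lr * d_o * h
        b2 -= lr * d_o
        w1 -= lr * d_h * x
        b1 -= lr * d_h
after = mean_squared_error(data)       # loss should fall after training
```

Each SGD step nudges every weight downhill along its own gradient, which is the "marble rolling toward the lowest point of the maze" picture the episode uses.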