Backpropagation Tutorial - What is Backpropagation and How Does it Work
Learn more advanced front-end and full-stack development at: https://www.fullstackacademy.com
Backpropagation is a neural network algorithm that calculates the error contribution of each neuron after processing a batch of data; the resulting gradients are what gradient descent uses to update the network's weights. In this Backpropagation Tutorial, we walk through the move from perceptrons to sigmoid neurons, as well as the idea behind feedforward neural networks. Gradient descent and stochastic gradient descent are used to explain the need for the backpropagation algorithm, which lets us quickly calculate the gradient of the cost function we use to measure how accurate our neural net's predictions are.
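As a rough illustration of the gradient descent idea mentioned above (this is not code from the video, and the cost function, starting point, and learning rate are just assumptions for the sake of the sketch), here is how repeatedly stepping against the gradient drives a simple cost toward its minimum:

```python
# Minimal sketch: gradient descent on a one-parameter quadratic cost.
# Cost C(w) = (w - 3)^2 has its minimum at w = 3; its gradient is 2*(w - 3).

def cost(w):
    return (w - 3.0) ** 2

def cost_gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0              # arbitrary starting point (an assumption for this sketch)
learning_rate = 0.1  # step size, also an assumption

for step in range(50):
    w -= learning_rate * cost_gradient(w)  # move against the gradient

print(f"w after descent: {w:.4f}, cost: {cost(w):.6f}")  # w approaches 3
```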
Watch this video to learn:
- What backpropagation is
- How artificial neural networks work
- What a cost function is and how to minimize it
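To tie these ideas together, here is a minimal sketch of backpropagation in a tiny sigmoid feedforward network. It is not the video's code; the network size (2 inputs, 3 hidden sigmoid neurons, 1 output), the XOR training data, the quadratic cost, and the learning rate are all assumptions chosen to keep the example self-contained:

```python
# Minimal sketch (not from the video): a 2-3-1 sigmoid network trained on XOR
# with backpropagation and plain gradient descent on a quadratic cost.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))  # input -> hidden
W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))  # hidden -> output
lr = 1.0  # learning rate (an assumption for this sketch)

for epoch in range(5000):
    # Feedforward pass
    a1 = sigmoid(X @ W1 + b1)   # hidden activations
    a2 = sigmoid(a1 @ W2 + b2)  # network output

    # Backpropagation: push the error of the quadratic cost
    # C = 0.5 * ||a2 - y||^2 backwards through the sigmoid layers
    # to get the gradient with respect to every weight and bias.
    delta2 = (a2 - y) * a2 * (1 - a2)          # output-layer error
    delta1 = (delta2 @ W2.T) * a1 * (1 - a1)   # hidden-layer error

    # Gradient descent step: move each parameter against its gradient.
    W2 -= lr * a1.T @ delta2
    b2 -= lr * delta2.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ delta1
    b1 -= lr * delta1.sum(axis=0, keepdims=True)

print(np.round(a2, 3))  # predictions should approach [0, 1, 1, 0]
```

The key point the sketch illustrates is that backpropagation computes the gradients (the delta terms), while gradient descent is the separate step that uses those gradients to update the weights.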