Why is it called Back Propagation?
- Post author: Jessica Alba
- Post date: April 6, 2020
While looking at the mathematics of the Back Propagation Algorithm for a Multi-Layer Perceptron, I noticed that in order to find the partial derivative of the cost function with respect to a weight (say $w$) from any of the hidden layers, we're just writing the error function from the final outputs in terms of the inputs and hidden-layer weights and then cancelling all the terms that do not involve $w$.

I can find the partial derivatives of the first hidden layer first and then go towards the other ones if I wanted to. Is there some other method? And why exactly is it called Back Propagation? I'm looking for a general method/algorithm, not just for 1-2 hidden layers. I'm fairly new to this and I'm just following what's being taught in class; nothing I found on the internet has cleared this up.
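For concreteness, here is a rough sketch of the layer-by-layer chain rule I think is being applied (the notation is my own assumption rather than from the class notes: $W^{(l)}$, $z^{(l)}$ and $a^{(l)}$ are the weights, pre-activations and activations of layer $l$, $\sigma$ the activation function, $C$ the cost, and biases are omitted):

$$
\delta^{(L)} = \nabla_{a^{(L)}} C \odot \sigma'\bigl(z^{(L)}\bigr),
\qquad
\delta^{(l)} = \bigl(W^{(l+1)}\bigr)^{\top} \delta^{(l+1)} \odot \sigma'\bigl(z^{(l)}\bigr),
\qquad
\frac{\partial C}{\partial W^{(l)}} = \delta^{(l)} \bigl(a^{(l-1)}\bigr)^{\top}.
$$

If I read this right, each $\delta^{(l)}$ reuses $\delta^{(l+1)}$ from the layer after it, so the computation naturally runs from the output layer back towards the input rather than starting at the first hidden layer.

To check my understanding of the general case (any number of hidden layers), here is a minimal NumPy sketch of that recursion; the function name, sigmoid activations and squared-error cost are my own assumptions, not necessarily what we use in class:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(weights, x, y):
    """Sketch: gradients of C = 0.5 * ||a_L - y||^2 for a fully connected
    net with sigmoid activations and no biases; `weights` is a list of W[l]."""
    # Forward pass: remember every pre-activation z and activation a.
    a, activations, zs = x, [x], []
    for W in weights:
        z = W @ a
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)

    # Backward pass: start with the error at the output layer...
    grads = [None] * len(weights)
    delta = (activations[-1] - y) * sigmoid(zs[-1]) * (1.0 - sigmoid(zs[-1]))
    grads[-1] = np.outer(delta, activations[-2])
    # ...and propagate delta towards the input, one layer at a time.
    for l in range(len(weights) - 2, -1, -1):
        delta = (weights[l + 1].T @ delta) * sigmoid(zs[l]) * (1.0 - sigmoid(zs[l]))
        grads[l] = np.outer(delta, activations[l])
    return grads  # grads[l] has the same shape as weights[l]
```

The loop visits the layers in reverse order, which I assume is exactly the "back" in the name, but please correct me if I've misunderstood.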