Deep neural networks step by step: backward propagation #part 3

Up to this point we have initialized our deep network's parameters and written the forward propagation module. Now we will implement the cost function and the backward propagation module. Just as with forward propagation, we will write helper functions for backpropagation. Recall that backpropagation is used to calculate the gradient of the loss function with respect to the parameters. So we need both forward and backward propagation for our LINEAR-RELU-LINEAR-SIGMOID model; a sketch of the backward helpers follows below.
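
Below is a minimal NumPy sketch of the pieces described above: a cross-entropy cost function and the backward helpers for the linear, ReLU, and sigmoid steps. The function names (compute_cost, linear_backward, relu_backward, sigmoid_backward) and the cache layout (A_prev, W, b stored during forward propagation) are assumptions carried over from the forward propagation part, not a definitive implementation.

import numpy as np

def compute_cost(AL, Y):
    """Cross-entropy cost between predictions AL and labels Y, both of shape (1, m)."""
    m = Y.shape[1]
    cost = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    return np.squeeze(cost)  # turn [[cost]] into a plain scalar

def linear_backward(dZ, cache):
    """Given dZ = dL/dZ for the current layer and the cached (A_prev, W, b)
    from forward propagation, return the gradients dA_prev, dW, db."""
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = np.dot(dZ, A_prev.T) / m                    # gradient w.r.t. weights
    db = np.sum(dZ, axis=1, keepdims=True) / m       # gradient w.r.t. biases
    dA_prev = np.dot(W.T, dZ)                        # gradient passed to the previous layer
    return dA_prev, dW, db

def relu_backward(dA, Z):
    """Backward pass through ReLU: the gradient flows only where Z > 0."""
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def sigmoid_backward(dA, Z):
    """Backward pass through sigmoid: dZ = dA * s * (1 - s), where s = sigmoid(Z)."""
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

To start backpropagation at the output layer, the gradient of the cross-entropy loss with respect to AL is dAL = -(Y / AL - (1 - Y) / (1 - AL)); feeding that into sigmoid_backward and then linear_backward gives the gradients for the last layer, and the ReLU layers are handled the same way going backward.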