The most popular optimization strategy in machine learning is gradient descent. When gradient descent is applied to neural networks, it's called back-propagation. In this video, I'll use analogies, animations, equations, and code to give you an in-depth understanding of this technique, which uses calculus to update our machine learning models. Once you feel comfortable with back-propagation, everything else becomes easier. Enjoy!
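To give a quick taste of the idea before you watch, here is a minimal sketch of gradient descent fitting a line to toy data with NumPy. This is an illustrative assumption, not the code linked below: the data, parameter names (`w`, `b`), and learning rate are all made up for the example.

```python
import numpy as np

# Toy data: learn y = 2x + 1 from noisy samples (illustrative, not the video's dataset).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 2 * x + 1 + rng.normal(scale=0.1, size=100)

# Model: y_hat = w * x + b, trained by minimizing mean squared error.
w, b = 0.0, 0.0
lr = 0.1  # learning rate (step size)

for step in range(200):
    y_hat = w * x + b
    error = y_hat - y
    # The calculus step: gradients of the MSE loss,
    # d/dw mean((w*x + b - y)^2) = mean(2 * error * x), and similarly for b.
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    # Gradient descent update: step each parameter against its gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")  # should approach w=2, b=1
```

Back-propagation is this same update applied to a neural network, with the chain rule computing the gradients layer by layer.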
Code for this video: