Choosing an Activation Function for Deep Learning

This is an excerpt from the online course "AWS Certified Machine Learning Specialty 2020 - Hands On!". In this video, we cover the different activation functions used in neural networks to produce the output of a given node, or neuron, given its set of inputs: linear, step, sigmoid / logistic, tanh / hyperbolic tangent, ReLU, Leaky ReLU, PReLU, Maxout, and more.
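As a quick reference for the functions the video walks through, here is a minimal NumPy sketch of the most common ones. The function names, the `alpha` parameter defaults, and the example input are illustrative choices, not taken from the course itself:

```python
import numpy as np

def linear(x):
    # Identity: output equals input; no non-linearity at all.
    return x

def step(x):
    # Binary step: fires 1 when the input is non-negative, else 0.
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Logistic function: squashes any input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes input into (-1, 1) and is zero-centered.
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positives through, zeroes out negatives.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but leaks a small fixed slope for negative inputs,
    # which avoids "dead" neurons that never activate.
    return np.where(x >= 0, x, alpha * x)

def prelu(x, alpha):
    # Parametric ReLU: same shape as Leaky ReLU, but alpha is a
    # parameter learned during training rather than a fixed constant.
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.    0.    0.    1.5]
print(leaky_relu(x))  # [-0.02  -0.005  0.    1.5]
```

Maxout is omitted above because it is not a pointwise function: it takes the maximum over several learned linear transformations of the input, so it needs weight matrices rather than a single scalar formula.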