A Review of Learning Rules in Machine Learning - March 8, 2021
In this research meeting, our research intern Alex Cuozzo reviews notable papers and explains high-level concepts related to learning rules in machine learning. Moving away from backpropagation with gradient descent, he covers various attempts at biologically plausible learning regimes that avoid the weight transport problem and use only information local to each neuron. He then discusses work that infers a learning rule from observed weight updates, and further work that uses machine learning itself to discover novel optimizers and local learning rules.
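One of the ideas mentioned above, from the Lillicrap et al. paper, is feedback alignment: the backward pass propagates errors through a fixed random matrix instead of the transpose of the forward weights, sidestepping the weight transport problem. The sketch below is illustrative only; the network sizes, toy regression task, and learning rate are assumptions, not details from the talk.

```python
import numpy as np

# Minimal sketch of feedback alignment (after Lillicrap et al.):
# errors are fed back through a fixed random matrix B rather than W2.T,
# so no neuron needs access to downstream forward weights.

rng = np.random.default_rng(0)

# Toy regression task (an assumption for illustration): learn y = x @ M
X = rng.standard_normal((200, 10))
M = rng.standard_normal((10, 2))
Y = X @ M

# Two-layer tanh network with small random initialization
W1 = rng.standard_normal((10, 32)) * 0.1
W2 = rng.standard_normal((32, 2)) * 0.1
B = rng.standard_normal((32, 2)) * 0.1   # fixed random feedback weights

lr = 0.01
losses = []
for step in range(500):
    # Forward pass
    h = np.tanh(X @ W1)
    out = h @ W2

    err = out - Y                        # dL/dout (up to a constant)
    losses.append(float(np.mean(err ** 2)))

    # Feedback alignment: route the error through B, not W2.T
    delta_h = (err @ B.T) * (1 - h ** 2)

    # Local-style updates from pre/post activity and the fed-back error
    W2 -= lr * h.T @ err / len(X)
    W1 -= lr * X.T @ delta_h / len(X)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Empirically, the forward weights tend to align with the random feedback matrix over training, which is why this still reduces the loss despite the "wrong" backward weights.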
Papers / Talks mentioned (in order of presentation):
• “Random synaptic feedback weights support error backpropagation for deep learning” by Lillicrap et al.
• Talk: A Theoretical Framework for Target Propagation
• “Decoupled Neural Interfaces using Synthetic Gradients” by DeepMind
• Talk: Brains@Bay Meetup (Ra