AI Weekly Update - June 16th, 2021 (#35!)

Content Links Below:
Generative Models as a Data Source for Multi-View Representation Learning
Learning to See by Looking at Noise
Knowledge Distillation: A Good Teacher is Patient and Consistent
Does Knowledge Distillation Really Work?
AdaMatch
Self-Damaging Contrastive Learning
Masked Self-Supervised Transformer for Visual Representation
Data-Efficient Instance Generation from Instance Discrimination
Scaling Vision Transformers
CoAtNet
Improving Language Model Behavior by Training on a Curated Dataset
Dynamic Language Models for Continuously Evolving Content