Primer: Searching for Efficient Transformers for Language Modeling, by Google Research. Paper explained
The Google Research team has introduced a new paper called
"Primer: Searching for Efficient Transformers for Language Modeling".
This paper makes two contributions.
First, it provides a better approach to neural architecture search, where the objective is to find the most efficient model architecture.
Second, using this search, they discovered an improved version of the original Transformer architecture and named it Primer.
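The search method discussed later in the video is evolutionary. A minimal sketch of a regularized (aging) evolution loop, the general style of search the paper builds on, could look like the following; the function names and hyperparameters here are illustrative, not the paper's exact setup:

```python
import random

def regularized_evolution(init, mutate, fitness,
                          pop_size=20, cycles=100, sample_size=5):
    # Illustrative sketch of regularized evolution, not the paper's code.
    population = [init() for _ in range(pop_size)]
    history = list(population)
    for _ in range(cycles):
        # Tournament selection: sample a few candidates, keep the fittest.
        sample = random.sample(population, sample_size)
        parent = max(sample, key=fitness)
        # Mutate the parent to produce a child architecture.
        child = mutate(parent)
        population.append(child)
        # "Aging": always discard the oldest individual, not the worst one.
        population.pop(0)
        history.append(child)
    return max(history, key=fitness)

# Toy usage: evolve an integer upward (stand-in for an architecture).
best = regularized_evolution(init=lambda: 0,
                             mutate=lambda x: x + 1,
                             fitness=lambda x: x)
```

In the paper, each individual would be a candidate model and the fitness would be its quality at a fixed training-compute budget; the toy integers above just show the loop structure.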
For Primer, the model architecture modifications are only two things:
1. Squared ReLU activations.
2. A depthwise convolution layer added after each of the query, key, and value projections in self-attention.
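The two modifications above can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation; the function names and the causal-padding convention are my own assumptions:

```python
import numpy as np

def squared_relu(x):
    # Primer's activation: relu(x) squared, replacing the usual
    # ReLU/GELU in the Transformer feed-forward block.
    return np.maximum(x, 0.0) ** 2

def depthwise_conv1d(x, kernel):
    # x: (seq_len, dim) activations; kernel: (k, dim), one filter per
    # channel (depthwise: channels never mix). In Primer this kind of
    # convolution is applied along the sequence axis after each of the
    # Q/K/V projections; the causal left-padding here is an assumption
    # for an autoregressive setting.
    k, dim = kernel.shape
    padded = np.concatenate([np.zeros((k - 1, dim)), x], axis=0)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        window = padded[t:t + k]             # last row = current token
        out[t] = (window * kernel).sum(axis=0)  # per-channel sum
    return out
```

For example, a kernel that is zero everywhere except ones in its last row acts as the identity, since only the current token contributes to each output position.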
If you like this content, please leave a like or a comment.
Subscribe for more videos coming soon.
Timestamps:
00:00 - Introduction
02:25 - Neural architecture search explained
07:50 - Neural architecture search space
15:00 - Search method: evolutionary search