10 years of NLP history explained in 50 concepts | From Word2Vec, RNNs to GPT
From RNNs to Transformers to GPT-4, progress in Deep Learning research for Language Modelling and NLP has been steady and instructive. In this video, I explain 50 concepts spanning the basics of NLP like Tokenization and Word Embeddings, seminal work like RNNs, Seq2Seq, and Attention, and influential Transformer models like BERT, GPT, XLNet, and InstructGPT. I present the challenges faced by earlier designs, how the current architectures improve on them, and areas where future research can do better.
For an overview of multimodal models that combine text with other input modalities like images, videos, and audio, check out:
The video is divided broadly into 5 chapters that together cover 50 concepts, chronologically tracing the advancements of NLP.
0:00 - Intro
0:47 - Basics of Language Modelling
1:44 - RNNs, Seq2Seq, Encoder-Decoders
6:18 - Understanding Transformers
8:26 - LLMs - BERT, GPT, XLNet, T5
12:54 - Human Alignment, ChatGPT, GPT4
16:20 - Outro
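As a taste of the Attention concept covered in the Transformers chapter, here is a minimal sketch of scaled dot-product attention in plain NumPy. This is an illustrative toy, not code from the video; the function name and the random toy inputs are my own choices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of queries, keys, and values.
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k)
    # to keep the softmax from saturating (as in "Attention is All You Need").
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V

# Toy example: 4 tokens, 8-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

In a real Transformer, Q, K, and V are learned linear projections of the same token embeddings, and many such attention "heads" run in parallel.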
Thanks for watching!
#deeplearning #machinelearning #gpt #ai
Papers referenced:
Word2Vec:
GRUs:
LSTMs:
Seq2Seq:
Enc-Dec Attention:
Attention is All You Need:
GPT:
BERT:
Relative Position Embeddings:
Transformer-XL:
T5:
XLNet:
Performer:
Linformer:
Longformer:
LoRA:
GPT-3:
InstructGPT:
GPT-4 Technical Report: