Transformers from scratch: 5. Positional Encoding

Repo link:

Transformers have no built-in notion of word order: on their own, they treat the input as an unordered set of tokens. Positional encoding is a strategy for injecting each word's position in the sequence into its representation. This video walks through the theory behind positional encoding and the code that implements it.
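As a companion to the description above, here is a minimal sketch of the sinusoidal positional encoding from the original Transformer paper ("Attention Is All You Need"), assuming that is the variant covered in the video; the function name `positional_encoding` and the NumPy-based implementation are illustrative, not taken from the repo:

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal positional encodings."""
    positions = np.arange(seq_len)[:, np.newaxis]   # (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]        # (1, d_model)
    # Each pair of dimensions (2i, 2i+1) shares one frequency: 1 / 10000^(2i / d_model).
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])           # sine on even dimensions
    pe[:, 1::2] = np.cos(angles[:, 1::2])           # cosine on odd dimensions
    return pe

# Example: encodings for 6 tokens with model dimension 8; in a Transformer
# this matrix is added to the token embeddings before the first attention layer.
print(positional_encoding(6, 8).round(3))
```

Because each position maps to a unique pattern of sines and cosines at different frequencies, the model can distinguish positions and, thanks to the trigonometric identities, attend to relative offsets as well.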