11 Sequences: Markov Models, Word2Vec and LSTMs

In these final lectures, we will discuss various deviations from the standard offline machine learning recipe we’ve covered so far. Today: learning from sequences, such as language, music and time series. In the first half, we discuss the simple but powerful approach of the Markov model, and our first embedding model: word2vec. In the second half, we discuss recurrent neural networks, a very powerful but somewhat more complex approach to dealing with sequences. Specifically, we focus on the LSTM, one of the most popular recurrent architectures.
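To make the Markov idea concrete before we dive in, here is a minimal sketch of a first-order (bigram) Markov model in Python. The corpus, function names and sampling choice are illustrative assumptions, not part of the lecture material: we simply count which token follows which, and sample successors in proportion to those counts.

```python
from collections import defaultdict, Counter
import random

def train_bigram(tokens):
    """Count, for each token, how often each successor follows it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def sample_next(counts, token):
    """Sample a successor in proportion to its observed frequency."""
    successors = counts[token]
    return random.choices(list(successors), weights=list(successors.values()))[0]

# toy corpus (hypothetical); any tokenized text works the same way
tokens = "the cat sat on the mat and the cat slept".split()
model = train_bigram(tokens)
print(sample_next(model, "the"))  # e.g. 'cat' or 'mat'
```

Chaining sample_next calls generates a sequence token by token, which is the basic generative use of a Markov model that the first half of the lecture builds on.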