Whether you’re a student, a researcher, or a practitioner, I hope that my detailed, in-depth explanation will give you the real understanding and knowledge that you’re looking for.
Python Notebooks (hosted on Google Colab) implement key portions of the algorithm from scratch to further illustrate the concepts.
Chapter 1 - Word Vectors - Inspect a Pretrained Model.ipynb
In the Chapter 1 Notebook, we'll play around with a pre-trained word2vec model to look at its vocabulary and to try out some of the basic operations commonly performed on word vectors.
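If you'd like a preview of those operations, here's a minimal sketch using gensim (the specific pre-trained model and queries here are illustrative, not necessarily the ones the notebook uses):

```python
import gensim.downloader as api

# Load a pre-trained word2vec model (an illustrative choice;
# the download is ~1.6 GB).
model = api.load('word2vec-google-news-300')

# Inspect the vocabulary.
print(len(model.key_to_index))   # vocabulary size
print(model['king'].shape)       # a single 300-dimensional vector

# Common operations on word vectors:
print(model.similarity('king', 'queen'))    # cosine similarity
print(model.most_similar('king', topn=5))   # nearest neighbors
print(model.most_similar(positive=['king', 'woman'],
                         negative=['man'], topn=1))  # analogy: ~'queen'
```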
Chapter 2 - Skip-gram Architecture.ipynb
In the Chapter 2 Notebook, we'll reinforce our understanding of the skip-gram neural network architecture by implementing a forward pass from scratch.
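As a rough preview, the skip-gram forward pass boils down to an embedding lookup followed by a softmax over the vocabulary. Here's a minimal NumPy sketch (the dimensions and variable names are mine, not necessarily the notebook's):

```python
import numpy as np

vocab_size, embed_dim = 10_000, 300

# The two weight matrices learned by skip-gram.
W_in = np.random.randn(vocab_size, embed_dim) * 0.01   # input (center-word) vectors
W_out = np.random.randn(embed_dim, vocab_size) * 0.01  # output (context-word) vectors

center_word_idx = 42

# Forward pass: the "hidden layer" is just a row lookup, because
# the input is a one-hot vector.
h = W_in[center_word_idx]   # shape: (embed_dim,)
scores = h @ W_out          # shape: (vocab_size,)

# Softmax turns the scores into a probability for every word
# in the vocabulary being a context word.
probs = np.exp(scores - scores.max())
probs /= probs.sum()
```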
Chapter 3 - Negative Sampling.ipynb
In the Chapter 3 Notebook, we'll explore the behavior of Negative Sampling and Subsampling of Frequent Words by implementing them on some example word count data.
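To give a flavor of what those implementations involve: negative samples are drawn from the unigram distribution raised to the 3/4 power, and frequent words are randomly discarded based on their corpus frequency. A minimal sketch on toy counts (the formulas follow the original word2vec paper; the toy data is mine):

```python
import numpy as np

# Toy word counts (illustrative data).
counts = {'the': 50_000, 'cat': 300, 'sat': 200, 'on': 20_000}
words = list(counts)
freqs = np.array([counts[w] for w in words], dtype=float)
fracs = freqs / freqs.sum()  # each word's fraction of the corpus

# Negative sampling: draw negatives from unigram^(3/4), which
# flattens the distribution so rare words are picked more often.
ns_probs = fracs ** 0.75
ns_probs /= ns_probs.sum()
negatives = np.random.choice(words, size=5, p=ns_probs)

# Subsampling of frequent words: probability of *keeping* word w,
# using the paper's default threshold of 0.001.
sample = 1e-3
keep_prob = (np.sqrt(fracs / sample) + 1) * (sample / fracs)
keep_prob = np.minimum(keep_prob, 1.0)
for w, p in zip(words, keep_prob):
    print(f"{w}: keep with probability {p:.3f}")
```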
Chapter 5 - Backprop Example.ipynb
In the Chapter 5 Notebook, we'll get hands-on with backpropagation and implement the weight updates (for a skip-gram model with negative sampling) from scratch!
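As a preview, with negative sampling each update only touches the center word's vector, one positive context word, and a handful of negative words, and the gradients reduce to simple sigmoid terms. Here's a minimal NumPy sketch of a single update (shapes and names are mine):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

embed_dim, lr = 300, 0.025
h = np.random.randn(embed_dim) * 0.01        # center word's input vector
ctx = np.random.randn(embed_dim) * 0.01      # positive context word's output vector
negs = np.random.randn(5, embed_dim) * 0.01  # 5 negative words' output vectors

# Prediction errors: sigmoid(score) minus the label
# (label = 1 for the true context word, 0 for negatives).
err_pos = sigmoid(ctx @ h) - 1.0
err_negs = sigmoid(negs @ h) - 0.0

# Accumulate the gradient w.r.t. the center word's vector
# *before* modifying the output vectors.
grad_h = err_pos * ctx + err_negs @ negs

# Gradient-descent weight updates.
ctx -= lr * err_pos * h
negs -= lr * np.outer(err_negs, h)
h -= lr * grad_h
```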
Chapter 6 - fastText Training Example.ipynb
In the Chapter 6 Notebook, we will train a word2vec model with subword information (using the ‘fasttext’ model in gensim) on the Wikipedia Attack Comments dataset. We'll look at how the training time and memory requirements compare with plain word2vec, as well as the quality of the resulting vectors.
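For reference, training with subword information in gensim looks nearly identical to plain word2vec; here's a minimal sketch (the toy corpus and hyperparameters are illustrative, while the notebook uses the real dataset):

```python
from gensim.models import FastText

# Toy corpus of pre-tokenized sentences (illustrative only).
sentences = [
    ['the', 'cat', 'sat', 'on', 'the', 'mat'],
    ['the', 'dog', 'barked', 'at', 'the', 'cat'],
]

# min_n / max_n control the character n-gram lengths used for
# subword information (gensim's defaults are 3 and 6).
model = FastText(sentences, vector_size=100, window=5,
                 min_count=1, min_n=3, max_n=6, epochs=10)

# Because of subwords, fastText can produce a vector even for a
# word it never saw during training.
vec = model.wv['catt']  # out-of-vocabulary word
```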
Appendix - Complete word2vec Training Example.ipynb
This Notebook goes through the full process of training a word2vec model using the gensim library. You can use this as a starting point for training your own model on your own dataset.
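The skeleton of that process looks something like this (a minimal sketch; the notebook covers the real details like preparing and tokenizing your corpus and choosing hyperparameters):

```python
from gensim.models import Word2Vec

# Your corpus: an iterable of pre-tokenized sentences.
sentences = [
    ['the', 'cat', 'sat', 'on', 'the', 'mat'],
    ['the', 'dog', 'barked', 'at', 'the', 'cat'],
]

# sg=1 selects skip-gram; negative=5 enables negative sampling
# with 5 negative words per positive sample.
model = Word2Vec(sentences, vector_size=100, window=5,
                 min_count=1, sg=1, negative=5, epochs=10)

model.save('my_word2vec.model')  # save for later use
model = Word2Vec.load('my_word2vec.model')
print(model.wv.most_similar('cat'))
```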
The Example Code is sold separately from the eBook; check the box to add it to your order during checkout.