VL11 NL1
Flashcards
How does word2vec work (general)?#card
General approach:
- Train on a binary prediction task:
	- Is word w likely to show up near word v?
- Use the learned classifier weights as the word embeddings
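A minimal sketch of the binary decision behind this: the classifier scores how likely a context word is a real neighbor of a target word via the sigmoid of their dot product. The 4-dimensional vectors here are made-up values, not trained embeddings.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical embeddings for a target word t and a candidate context word c.
target = np.array([0.5, -0.2, 0.3, 0.1])
context = np.array([0.4, -0.1, 0.2, 0.0])

# P(+ | t, c): probability that `context` really occurs near `target`,
# modeled as the sigmoid of the dot product of the two vectors.
p_neighbor = sigmoid(target @ context)
print(p_neighbor)
```

A score near 1 means "likely a neighbor", near 0 means "likely a random word"; with these untrained vectors it sits just above chance.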
List the steps of the skip-gram task (word2vec)#card
- Approach: Predict if candidate word c is a neighbor
- Positive examples: target word t and context word c (+/- 2 word window)
- Negative examples: Random words from lexicon
- Train an ML classifier to distinguish the two cases (→ logistic regression)
- Iteratively make the embeddings for a word:
- more like the neighbors embeddings
- less like the embeddings of other words
- Use the learned weights as the embeddings
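The steps above can be sketched as a toy skip-gram training loop with negative sampling. Everything here is an illustrative assumption (toy corpus, 8-dimensional vectors, hyperparameter values); real word2vec also uses a frequency-weighted negative-sampling distribution, which this sketch replaces with uniform sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus; word2vec is normally trained on very large text collections.
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim, window, k, lr, epochs = 8, 2, 2, 0.05, 200

# Two weight matrices: W for target words, C for context words.
W = rng.normal(scale=0.1, size=(len(vocab), dim))
C = rng.normal(scale=0.1, size=(len(vocab), dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, word in enumerate(corpus):
        t = idx[word]
        # Positive examples: context words within a +/- `window` word window.
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            c = idx[corpus[ctx_pos]]
            # Negative examples: k random words from the lexicon (uniform here).
            negs = rng.integers(0, len(vocab), size=k)
            # Logistic-regression gradient step:
            # pull the target toward its real neighbor's embedding ...
            g = sigmoid(W[t] @ C[c]) - 1.0
            grad_t = g * C[c]
            C[c] -= lr * g * W[t]
            # ... and push it away from the random words' embeddings.
            for n in negs:
                g = sigmoid(W[t] @ C[n])
                grad_t += g * C[n]
                C[n] -= lr * g * W[t]
            W[t] -= lr * grad_t

# After training, the rows of W are used as the word embeddings.
def similarity(a, b):
    va, vb = W[idx[a]], W[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
```

The loop makes each word's embedding more like its neighbors' embeddings and less like random words' embeddings, which is exactly the iterative update described above; the learned matrix `W` is then kept as the embedding table.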