Language Model
= a probability distribution over words or word sequences
How?
Unigrams
- Only use word features and use all words → Naive Bayes
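A minimal sketch of the unigram idea: every word is treated as an independent feature (the Naive Bayes assumption), so a sequence's probability is just the product of per-word frequencies. The toy corpus and function names here are illustrative, not from the note.

```python
import math
from collections import Counter

# Toy corpus (illustrative)
corpus = "the cat sat on the mat".split()

counts = Counter(corpus)
total = sum(counts.values())

def unigram_prob(word):
    # P(w) = count(w) / N, independent of any context
    return counts[word] / total

def sequence_logprob(words):
    # Independence assumption: log P(w1..wn) = sum of log P(wi)
    return sum(math.log(unigram_prob(w)) for w in words)
```

Because word order is ignored, "the cat sat" and "sat cat the" get the same score, which is exactly the weakness n-grams address.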
N-Grams (2+)
- Compute the conditional probability for all words w of W
- Choose the word combination with the highest conditional probability (= most likely follow-up word)
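The two steps above can be sketched with a bigram (2-gram) model: estimate P(w2 | w1) = count(w1, w2) / count(w1) from corpus counts, then pick the follow-up word with the highest conditional probability. The corpus is a made-up example.

```python
from collections import Counter

# Toy corpus (illustrative)
corpus = "the cat sat on the mat the cat ran".split()

# Count bigrams and unigrams
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def cond_prob(w1, w2):
    # P(w2 | w1) = count(w1, w2) / count(w1)
    return bigrams[(w1, w2)] / unigrams[w1]

def most_likely_followup(w1):
    # Choose the follow-up word with the highest conditional probability
    candidates = {w2: cond_prob(w1, w2) for (a, w2) in bigrams if a == w1}
    return max(candidates, key=candidates.get)

print(most_likely_followup("the"))  # "cat": P(cat|the) = 2/3 vs P(mat|the) = 1/3
```

Real systems smooth these counts (e.g. add-one smoothing) so unseen bigrams don't get probability zero.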
Links