
📙 Language Model

==A 📊 Probability distribution over words or word sequences==

How?

Unigrams

  • Use only individual word features, treating all words as independent of their context → 🤦‍♂️ Naive Bayes
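A minimal sketch of the unigram view (the toy corpus is an assumption for illustration): each word gets a probability from its relative frequency alone, with no context.

```python
from collections import Counter

# Toy corpus (hypothetical example data)
corpus = "the cat sat on the mat the cat ran".split()

counts = Counter(corpus)
total = sum(counts.values())

def p_unigram(word):
    # Unigram probability: relative frequency, context ignored
    return counts[word] / total

print(p_unigram("the"))  # 3 of 9 tokens → ≈ 0.333
```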

N-Grams (2+)

  1. Compute the conditional 🎲 Probability for every word w of the vocabulary W:

    • P(w ∣ w₁ … wₙ₋₁)
  2. Choose the word with the highest conditional probability (= the most likely follow-up word)
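The two steps above can be sketched for the bigram case (n = 2), again with a made-up toy corpus: count word pairs, turn counts into conditional probabilities, and pick the argmax.

```python
from collections import Counter, defaultdict

# Toy corpus (hypothetical example data)
corpus = "the cat sat on the mat the cat ran".split()

# Step 1: count bigrams so that P(w | prev) = count(prev, w) / count(prev)
bigrams = defaultdict(Counter)
for prev, w in zip(corpus, corpus[1:]):
    bigrams[prev][w] += 1

def predict(prev):
    # Step 2: return the follow-up word with the highest conditional probability
    following = bigrams[prev]
    total = sum(following.values())
    word, count = following.most_common(1)[0]
    return word, count / total

print(predict("the"))  # "the" is followed by "cat" 2 of 3 times → ('cat', 0.667)
```

For n > 2 the dictionary key would simply be a tuple of the previous n−1 words instead of a single word.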

🔗 Links

🔮 Word Prediction
