VL12 NL1
Flashcards
What is 🧑 BERT?#card
= a 🧠 Machine Learning technique for ⚙ Natural Language Processing, which uses dense, contextual embeddings.
Define 👀 Contextual embedding#card
= 〰️ Word embedding that takes the context of a word into account
Define 🤷♂️ Non-Contextual embedding#card
= 〰️ Word embedding that does not take the context of a word into account
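A minimal sketch of the difference, assuming the Hugging Face `transformers` and `torch` packages and `bert-base-uncased` as an example checkpoint: BERT assigns the same surface word different vectors in different sentences, while a static (non-contextual) embedding table such as word2vec or GloVe would return one fixed vector.

```python
# Sketch: contextual (BERT) vs. non-contextual embeddings.
# Assumes the `transformers` and `torch` packages; model choice is an example.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)
    # Locate the token position of `word` (assumes it is a single word piece).
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# The same word "bank" gets different vectors in different contexts ...
bank_river = embedding_of("He sat on the bank of the river.", "bank")
bank_money = embedding_of("She deposited money at the bank.", "bank")
print(torch.cosine_similarity(bank_river, bank_money, dim=0))  # noticeably < 1.0

# ... whereas a non-contextual table (word2vec, GloVe) would map "bank"
# to one and the same vector regardless of the sentence it appears in.
```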
What’s the 👔 Chomsky normal form?#card
= a 📖 Formal grammar which fulfills the following criteria:
- no empty right-hand side
- every right-hand side has either 2x non-terminals or 1x terminal
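As a quick illustration, the two criteria can be checked per rule; the `(lhs, rhs)` encoding of rules below is an assumed representation, not something from the lecture.

```python
# Sketch: checking the Chomsky-normal-form criteria for a toy grammar.
NONTERMINALS = {"S", "NP", "VP", "V", "Det", "N"}

def is_cnf_rule(lhs: str, rhs: tuple) -> bool:
    """A rule is in CNF iff its right-hand side is either exactly
    two non-terminals or exactly one terminal (never empty)."""
    if len(rhs) == 2:
        return all(symbol in NONTERMINALS for symbol in rhs)
    if len(rhs) == 1:
        return rhs[0] not in NONTERMINALS  # a single terminal
    return False                           # empty or longer right-hand sides

rules = [
    ("S", ("NP", "VP")),        # 2 non-terminals: ok
    ("VP", ("V", "NP")),        # 2 non-terminals: ok
    ("V", ("drives",)),         # 1 terminal: ok
    ("NP", ("Det", "N", "N")),  # 3 symbols: not CNF
]
print([is_cnf_rule(lhs, rhs) for lhs, rhs in rules])  # [True, True, True, False]
```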
What’s a 🏗 Formal generative grammar?#card
= a fully explicit 📖 Formal grammar
What’s the formal definition of a 🏗 Formal generative grammar?#card
- $N$: finite set of non-terminal symbols
- $\Sigma$: finite set of terminal symbols, disjoint from $N$
- $S$: start symbol, $S \in N$
- $P$: finite set of rules of type $\alpha \to \beta$
	- with $\alpha$ containing min. 1 non-terminal symbol
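A minimal sketch of that 4-tuple as a data structure; the toy fragment of English below is a made-up example, not from the lecture.

```python
# Sketch: a formal generative grammar as the 4-tuple (N, Sigma, S, P).
from typing import NamedTuple

class Grammar(NamedTuple):
    nonterminals: set  # N: finite set of non-terminal symbols
    terminals: set     # Sigma: finite set of terminals, disjoint from N
    start: str         # S: start symbol, S in N
    rules: list        # P: rules alpha -> beta, alpha containing >= 1 non-terminal

toy = Grammar(
    nonterminals={"S", "NP", "VP"},
    terminals={"we", "drive", "a", "car"},
    start="S",
    rules=[
        (("S",), ("NP", "VP")),          # alpha -> beta, written as symbol tuples
        (("NP",), ("we",)),
        (("VP",), ("drive", "a", "car")),
    ],
)

# Sanity checks mirroring the definition above:
assert toy.nonterminals.isdisjoint(toy.terminals)
assert toy.start in toy.nonterminals
assert all(any(s in toy.nonterminals for s in alpha) for alpha, _ in toy.rules)
```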
List 3/4 [[Natural language#Phenomena|🗣 Natural language#Phenomena]]#card
- Subcategorization
- verbs can take certain complements
- but can also have a number of optional modifiers
- e.g. “I sleep here”, “I hardly ever sleep here”
- Agreement
- main verb agrees with the subject in person and number
- e.g. “We drive a car”, “He drives a car”
- Movement
- certain constituents can be moved around
- e.g. “He drove back”, “Back he drove”
- Coordination
- phrases can be conjoined by means of *and* and *or*
- e.g. “John and Nick drove home”