Dynamic Learning Platform
tl;dr: AI-native learning platform for acquiring languages dynamically
Why?
Current language learning tools are still mostly software 1.0, or not user-friendly enough for a broad audience
What?
- Pages (can be opened by clicking small ribbon items on the left side, like in VS Code or Obsidian):
  - Chat (main tab)
  - Knowledge
Chat (main page)
The chat tab is a basic AI chat that offers all the features you would expect
Special features:
- two modes: normal and "knowledge" mode
Knowledge mode:
- when enabled, the chat will automatically interact with the knowledge base to:
  - enrich answers with additional information from the knowledge base
  - automatically update the knowledge base with information from the chat (or ask the user first)
Example (explain):
Prompt: "explain 「今日、大学に行った。」" Response:
**TL;DR:**
The sentence _今日、大学に行った_ means **"I went to university today."**
**Knowledge:**
- **[[今日（きょう）]]** = today
- **[[大学（だいがく）]]** = university
- **[[に]]** = particle indicating direction or destination ("to")
- **[[行った（いった）]]** = past tense of _行く（いく）_, meaning "to go"
Note: [[]] marks a knowledge item here; in the real response it should probably be some HTML element that contains all the information required to either retrieve the item or create it.
Furthermore, the response allows the following actions:
- you can click on words or grammar points (automatically highlighted and recognized as such) to save them in your knowledge base (or, if they already exist, open them); see the component sketch below
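As a rough sketch of how such a clickable knowledge reference could be rendered in the React frontend (component and prop names are purely illustrative, not part of the spec):

```tsx
// Sketch of a clickable knowledge reference inside a chat response.
// Component and prop names are hypothetical.
import React from "react";

type KnowledgeRefProps = {
  label: string;                       // the highlighted word or grammar point, e.g. "行った（いった）"
  knowledgeType: string;               // e.g. "japanese grammar"
  existingId?: string;                 // set when the item already exists in the knowledge base
  content: Record<string, unknown>;    // data needed to create the item if it does not exist yet
  onOpen: (id: string) => void;
  onCreate: (knowledgeType: string, content: Record<string, unknown>) => Promise<string>;
};

export function KnowledgeRef({ label, knowledgeType, existingId, content, onOpen, onCreate }: KnowledgeRefProps) {
  const handleClick = async () => {
    // Existing items open directly; new ones are saved first, then opened.
    const id = existingId ?? (await onCreate(knowledgeType, content));
    onOpen(id);
  };
  return (
    <span className="knowledge-ref underline cursor-pointer" onClick={handleClick}>
      {label}
    </span>
  );
}
```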
Example (write):
Prompt: "how to say 'I went to university today'?" Response:
**TL;DR:** The natural way to say "I went to university today" in Japanese is:
> **今日は大学に行きました。**
> *Kyō wa daigaku ni ikimashita.*
Below is a quick breakdown in "Translate" format.
**Full Translation**
| 文 | 読み | 意味 |
| --- | --- | --- |
| **今日は大学に行きました。** | きょう は だいがく に いきました | I went to university today. |
**Knowledge**
| 要素 | 説明 |
| --- | --- |
| **[[今日は]]** | Sets "today" (きょう) as the topic with **は**. It frames the sentence as "As for today…". |
| **[[大学]]** | "University." |
| **[[に]]** | Target/destination particle. You can also use **へ** (e.g. 今日は大学へ行きました); the nuance difference is tiny: **に** feels a bit more factual, **へ** a bit more directional. |
| **[[行きました]]** | Past-polite form of 行く *iku*, "to go."<br>- Dictionary: 行く<br>- Stem: 行き<br>- + ました (polite past) |
The answer is preferably generated using grammar and words the user already knows; furthermore, it may also suggest a better solution if appropriate. All these knowledge items are also clickable, as in the first example.
Knowledge Page
A table containing a list of all knowledge types as spoiler items (title = type)
When a spoiler is expanded, it shows a table with all knowledge items of that type (one column per property)
For now, the knowledge page is read-only.
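A minimal sketch of how the page could be built from shadcn/ui accordion and table components; the KnowledgeType/KnowledgeItem shapes and prop names are assumptions, not part of the spec:

```tsx
// Read-only knowledge page: one accordion item ("spoiler") per knowledge type,
// each expanding into a table with one column per property of that type.
import { Accordion, AccordionContent, AccordionItem, AccordionTrigger } from "@/components/ui/accordion";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";

type KnowledgeType = { id: string; name: string; properties: string[] };
type KnowledgeItem = { id: string; typeId: string; content: Record<string, string> };

export function KnowledgePage({ types, items }: { types: KnowledgeType[]; items: KnowledgeItem[] }) {
  return (
    <Accordion type="multiple">
      {types.map((t) => (
        <AccordionItem key={t.id} value={t.id}>
          <AccordionTrigger>{t.name}</AccordionTrigger>
          <AccordionContent>
            <Table>
              <TableHeader>
                <TableRow>
                  {t.properties.map((p) => <TableHead key={p}>{p}</TableHead>)}
                </TableRow>
              </TableHeader>
              <TableBody>
                {items.filter((i) => i.typeId === t.id).map((i) => (
                  <TableRow key={i.id}>
                    {t.properties.map((p) => <TableCell key={p}>{i.content[p]}</TableCell>)}
                  </TableRow>
                ))}
              </TableBody>
            </Table>
          </AccordionContent>
        </AccordionItem>
      ))}
    </Accordion>
  );
}
```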
How?
Data Schema
Knowledge is stored in a table with fields:
- id
- type (a user-defined enum)
- content (jsonb that fits the schema of the knowledge type)
Knowledge types (user-defined) live in their own table with fields:
- id
- name
- description (what it is in natural language)
- definition (JSON Schema definition)
- e.g. "japanese grammar":
{ grammar: { type: string, description: 'the grammatical concept in japanese' }, example: { type: string, description: 'an example application of the grammatical concept' }, explanation: { type: string, description: 'explanation of how to use the grammatical concept'} }
Chats and messages are also stored in separate tables.
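Illustrative TypeScript shapes for these tables (field names are assumptions; in practice the row types would be generated from the Supabase schema):

```ts
// Illustrative row types for the knowledge tables; not generated code.
type KnowledgeTypeRow = {
  id: string;
  name: string;                        // e.g. "japanese grammar"
  description: string;                 // what it is, in natural language
  definition: Record<string, unknown>; // JSON Schema for the content of items of this type
};

type KnowledgeRow = {
  id: string;
  type: string;                        // references the knowledge type
  content: Record<string, unknown>;    // jsonb, must fit the type's JSON Schema
};

// A more explicit JSON Schema rendering of the "japanese grammar" example above:
const japaneseGrammarDefinition = {
  type: "object",
  properties: {
    grammar: { type: "string", description: "the grammatical concept in Japanese" },
    example: { type: "string", description: "an example application of the grammatical concept" },
    explanation: { type: "string", description: "explanation of how to use the grammatical concept" },
  },
  required: ["grammar", "example", "explanation"],
};
```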
Embeddings
Knowledge (from the knowledge table) can also be queried in natural language (store embeddings directly on the table in Postgres, e.g. via pgvector, or alternatively use another vector DB).
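A minimal sketch of such a lookup, assuming pgvector plus a hypothetical `match_knowledge` Postgres function and OpenAI embeddings (the function name and model choice are assumptions):

```ts
// Natural-language knowledge lookup sketch. "match_knowledge" would be a
// Postgres function that orders knowledge rows by embedding similarity.
import { createClient } from "@supabase/supabase-js";
import OpenAI from "openai";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);
const openai = new OpenAI();

export async function searchKnowledge(searchTerm: string, knowledgeType?: string) {
  // Embed the search term, then ask Postgres for the nearest knowledge items.
  const res = await openai.embeddings.create({
    model: "text-embedding-3-small",
    input: searchTerm,
  });
  const embedding = res.data[0].embedding;

  const { data, error } = await supabase.rpc("match_knowledge", {
    query_embedding: embedding,
    match_count: 5,
    knowledge_type: knowledgeType ?? null,
  });
  if (error) throw error;
  return data;
}
```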
How updating knowledge works
When "knowledge mode" is enabled (and works as explained in the examples above), the chat has access to new tools provided by an (internal) MCP server; it exposes:
getKnowledgeTypes
- returns all knowledge types plus their descriptions; using this information, the agent can then decide whether to look up knowledge
getKnowledge(queries[])
- query:
  - searchTerm: natural language search
  - knowledgeType?: optional knowledge type that must match
- returns all knowledge items that match (via vector search on the content of knowledge items)
addKnowledge(knowledge[])
- function to add new knowledge items
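A rough sketch of how these tools could be exposed with the MCP TypeScript SDK; `listKnowledgeTypes`, `searchKnowledge`, and `insertKnowledge` are hypothetical helpers standing in for the database layer:

```ts
// Internal MCP server sketch exposing the three knowledge tools.
// (Transport setup, e.g. stdio or HTTP, is omitted.)
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { listKnowledgeTypes, searchKnowledge, insertKnowledge } from "./knowledge"; // hypothetical helpers

const server = new McpServer({ name: "knowledge", version: "0.1.0" });

server.tool("getKnowledgeTypes", async () => ({
  content: [{ type: "text", text: JSON.stringify(await listKnowledgeTypes()) }],
}));

server.tool(
  "getKnowledge",
  { queries: z.array(z.object({ searchTerm: z.string(), knowledgeType: z.string().optional() })) },
  async ({ queries }) => {
    // Each query runs a vector search; results are flattened into one list.
    const results = await Promise.all(queries.map((q) => searchKnowledge(q.searchTerm, q.knowledgeType)));
    return { content: [{ type: "text", text: JSON.stringify(results.flat()) }] };
  },
);

server.tool(
  "addKnowledge",
  { knowledge: z.array(z.object({ type: z.string(), content: z.record(z.unknown()) })) },
  async ({ knowledge }) => {
    await insertKnowledge(knowledge);
    return { content: [{ type: "text", text: `Added ${knowledge.length} knowledge item(s).` }] };
  },
);
```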
Furthermore, the system prompt is adjusted to tell the agent to use these tools to enrich answers, e.g. as described for the two use cases above (however, keep it generalized).
General Architecture
- Postgres DB via Supabase (+ auth etc.)
- backend: TypeScript server (Next.js?!)
  - uses Supabase types
  - Supabase is configured as far as possible via code
  - migrations are declaratively managed in the repo as well
- frontend: TypeScript React app
  - shadcn/ui components
  - state management via Zustand
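For illustration, a minimal Zustand store for the chat's knowledge-mode toggle and message list; the shape is a sketch, not part of the spec:

```ts
// Minimal Zustand store sketch for chat UI state.
import { create } from "zustand";

type ChatMessage = { role: "user" | "assistant"; content: string };

type ChatState = {
  knowledgeMode: boolean;
  messages: ChatMessage[];
  toggleKnowledgeMode: () => void;
  addMessage: (message: ChatMessage) => void;
};

export const useChatStore = create<ChatState>((set) => ({
  knowledgeMode: true,
  messages: [],
  toggleKnowledgeMode: () => set((s) => ({ knowledgeMode: !s.knowledgeMode })),
  addMessage: (message) => set((s) => ({ messages: [...s.messages, message] })),
}));
```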