tokens - Concepts
Explore concepts tagged with "tokens"
Total concepts: 3
Concepts
- Token - A fundamental unit of text that language models process, typically representing a word, subword, or character.
- Text Generation - The process by which language models produce coherent text by predicting and outputting sequences of tokens.
- Context Window - The maximum number of tokens a language model can process in a single interaction, determining how much information it can consider when generating responses.
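
The three concepts above can be illustrated together with a toy sketch: a simplistic word-level tokenizer (real LLM tokenizers use subword schemes such as BPE) and a context-window truncation that keeps only the most recent tokens. All function names here are illustrative, not part of any real library.

```python
# Toy illustration, NOT a real LLM tokenizer: real tokenizers split text
# into subwords; whitespace splitting is a deliberate simplification.

def tokenize(text: str) -> list[str]:
    """Split text into word-level 'tokens'."""
    return text.split()

def fit_context(tokens: list[str], window: int) -> list[str]:
    """Truncate to a fixed context window, keeping the most recent tokens,
    as many chat systems do when a conversation grows too long."""
    return tokens[-window:]

prompt = "the quick brown fox jumps over the lazy dog"
tokens = tokenize(prompt)
print(len(tokens))             # 9
print(fit_context(tokens, 4))  # ['over', 'the', 'lazy', 'dog']
```

During text generation, a model repeatedly predicts the next token given the tokens currently inside its context window, so truncation like this determines what information the model can still "see".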