Tokenization

Breaking text into smaller units (tokens) that AI models can process.

Related Concepts: Large Language Models (LLMs), Context Window, Embedding, Token, Next-Token Prediction, Natural Language Processing
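The idea can be sketched with a toy whitespace tokenizer that maps text to integer IDs. This is a minimal illustration only: production LLM tokenizers (e.g. byte-pair encoding) split text into subword units rather than whole words, but the text-to-ID mapping is the same basic step.

```python
def build_vocab(corpus):
    """Assign an integer ID to each unique whitespace-separated token."""
    vocab = {}
    for word in corpus.split():
        if word not in vocab:
            vocab[word] = len(vocab)
    return vocab

def tokenize(text, vocab, unk_id=-1):
    """Map text to a list of token IDs; out-of-vocabulary words get unk_id."""
    return [vocab.get(word, unk_id) for word in text.split()]

vocab = build_vocab("the cat sat on the mat")
print(tokenize("the cat sat", vocab))  # -> [0, 1, 2]
```

Models never see raw text: the ID sequence produced here is what gets looked up in an embedding table and counted against the context window.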