llm - Concepts
Explore concepts tagged with "llm"
Total concepts: 4
Concepts
- Context Window - The maximum number of tokens an LLM can process in a single interaction, determining how much information it can consider when generating responses.
- Prompt Lazy Loading - An AI design pattern that defers loading detailed prompt instructions until they are actually needed.
- Receptionist AI Design Pattern - An AI architecture pattern using a lightweight coordinator to route requests to specialized AI agents.
- RAG Pipelines - Data processing workflows that handle the end-to-end flow from document ingestion to LLM response generation in Retrieval-Augmented Generation systems.
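Two of the patterns above compose naturally: a lightweight coordinator routes a request to a specialist, and the specialist's detailed prompt is only loaded once the route is decided, keeping the coordinator's context window small. A minimal sketch, assuming a keyword-based router; all names (`route`, `AGENT_KEYWORDS`, `load_agent_prompt`) are illustrative, not from this page:

```python
# Illustrative sketch: Receptionist pattern + prompt lazy loading.
# Keyword tables and agent names are hypothetical examples.
AGENT_KEYWORDS = {
    "billing": ["invoice", "refund", "charge"],
    "support": ["error", "crash", "bug"],
}

def load_agent_prompt(agent: str) -> str:
    # Lazy loading: the detailed prompt is built only when the agent
    # is actually chosen, not up front for every agent.
    return f"You are the {agent} specialist. Detailed instructions follow."

def route(request: str) -> tuple[str, str]:
    # Lightweight coordinator: matches keywords, then defers to the
    # specialist's full prompt on demand.
    text = request.lower()
    for agent, words in AGENT_KEYWORDS.items():
        if any(w in text for w in words):
            return agent, load_agent_prompt(agent)
    return "general", load_agent_prompt("general")

agent, prompt = route("I was double-charged on my last invoice")
```

In a real system the keyword match would typically be replaced by a small classifier or an LLM call, but the shape is the same: route first, load detail second.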