Generated Knowledge Prompting
A two-step technique where the AI first generates relevant background knowledge, then uses that knowledge to answer the question.
Also known as: Knowledge Generation Prompting, Two-Stage Prompting, Self-Generated Context
Category: Techniques
Tags: ai, prompting, reasoning, llm-techniques, knowledge-management
Explanation
Generated Knowledge Prompting is a technique that improves AI reasoning by first asking the model to generate relevant facts or background knowledge about a topic, then using that generated knowledge as context for answering the actual question.
The two-step process:
1. **Knowledge Generation**: "Generate some facts about [topic] that would be useful for answering questions about it."
2. **Knowledge Integration**: "Given the following knowledge: [generated facts], answer this question: [actual question]"
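In code, the two stages are simply two sequential model calls, with the second prompt templated on the first's output. Below is a minimal Python sketch; `ask` is a hypothetical stand-in for whatever chat-completion call your LLM provider exposes, and the prompt wording is just one reasonable choice.

```
def ask(prompt: str) -> str:
    """Hypothetical stand-in: send `prompt` to your LLM provider
    (e.g. a chat-completion endpoint) and return the text reply."""
    raise NotImplementedError("wire this to your model API")

def generated_knowledge_answer(topic: str, question: str) -> str:
    # Step 1: Knowledge generation -- ask the model for background facts.
    knowledge = ask(
        f"Generate 5 key facts about {topic} that would be useful "
        f"for answering questions about it."
    )
    # Step 2: Knowledge integration -- answer with those facts as context.
    return ask(
        f"Given the following knowledge:\n{knowledge}\n\n"
        f"Answer this question: {question}"
    )

# Example call (mirrors the photosynthesis example below):
# generated_knowledge_answer("photosynthesis", "Why are plants green?")
```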
Why it works:
- Activates relevant information from the model's training data
- Makes context explicit that the model might otherwise leave implicit
- Can reduce hallucination by grounding the answer in explicitly stated facts (provided the generated facts are themselves accurate)
- Encourages systematic thinking before answering
Example:
```
Step 1: "Generate 5 key facts about photosynthesis."
→ Facts about light reactions, chlorophyll, CO2 absorption, etc.
Step 2: "Using the facts above, explain why plants are green."
→ More accurate, grounded response
```
Best applications:
- Commonsense reasoning tasks
- Questions requiring domain knowledge
- Educational explanations
- Fact-based analysis
Variations:
- Generate knowledge from multiple perspectives
- Generate counterarguments before deciding
- Generate examples before abstracting principles
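The multiple-perspectives variation, for instance, only changes the Step 1 prompt. A hedged sketch reusing the hypothetical `ask` helper from above, where the perspective labels are illustrative:

```
def multi_perspective_knowledge(topic: str, perspectives: list[str]) -> str:
    """Variation: gather facts from several angles before answering.
    `perspectives` might be e.g. ["economic", "environmental", "historical"]."""
    sections = []
    for view in perspectives:
        facts = ask(f"From a {view} perspective, list 3 key facts about {topic}.")
        sections.append(f"{view.title()} perspective:\n{facts}")
    # The combined text becomes the context for the integration step.
    return "\n\n".join(sections)
```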
This technique can be combined with self-consistency by generating multiple knowledge sets and selecting the best answer.
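One way to realize that combination, again assuming the hypothetical `ask` helper samples with nonzero temperature so that knowledge sets differ across calls: answer once per sampled knowledge set, then take a majority vote over the answers.

```
from collections import Counter

def self_consistent_answer(topic: str, question: str, n_samples: int = 5) -> str:
    """Generated knowledge + self-consistency: one answer per sampled
    knowledge set, then a majority vote over the results."""
    answers = [
        generated_knowledge_answer(topic, question) for _ in range(n_samples)
    ]
    # Exact-match voting shows the idea; for free-form text you would
    # normalize or cluster the answers before voting.
    return Counter(answers).most_common(1)[0][0]
```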