Generate Knowledge
Think before answering
The Problem: Sometimes AI needs background knowledge to answer well, but that knowledge isn't in the prompt. How can we help AI access its own relevant knowledge?
The Solution: Brainstorm Before Answering
Generate Knowledge prompting asks the AI to first generate relevant facts and context, then use them as background for answering the actual question. It's like brainstorming what you already know before tackling a problem. It complements Chain-of-Thought by providing richer context, and unlike RAG it retrieves knowledge from the model itself rather than from external documents.
Think of it like a brainstorming session:
1. Question: "Is glass a solid or liquid?"
2. Generate knowledge: "Glass is an amorphous solid... molecules don't flow..."
3. Use knowledge: Include the generated facts in the context
4. Answer: A more accurate response with proper background
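The four steps above boil down to a two-call pipeline. Here `ask_llm` is a hypothetical stand-in for any chat-completion call, mocked with canned responses so the flow is runnable:

```python
def ask_llm(prompt):
    # Hypothetical LLM call, mocked here for illustration.
    if "List facts" in prompt:
        return ("Glass is an amorphous solid. "
                "Its molecules are frozen in place and do not flow.")
    return "Glass is an amorphous solid."

def generate_knowledge_answer(question):
    # Step 2: ask the model to recall relevant background facts first.
    knowledge = ask_llm(f"List facts relevant to: {question}")
    # Steps 3-4: feed those facts back in as context for the real question.
    return ask_llm(f"Knowledge: {knowledge}\nQuestion: {question}\nAnswer:")

answer = generate_knowledge_answer("Is glass a solid or a liquid?")
print(answer)  # → Glass is an amorphous solid.
```

The key point is that both calls go to the same model: the first call surfaces latent knowledge, the second conditions on it.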
Where Is This Used?
- Science Questions: Generating relevant scientific background
- Common Sense Reasoning: Making implicit knowledge explicit
- Trivia/Knowledge Tasks: Recalling relevant facts first
- Writing Tasks: Generating research before drafting
Fun Fact: This technique mimics how humans think! We often "warm up" our memory by recalling related facts before answering. Studies report that this improves accuracy on commonsense reasoning tasks by several percentage points.
Try It Yourself!
Use the interactive example below to see how generating knowledge first leads to more informed and accurate answers.
Generate Knowledge Prompting
First generate relevant knowledge, then use it to answer
Part of golf is trying to get a higher point total than others. Yes or No?
How It Works
When to Use Generate Knowledge
- Questions requiring specific facts
- Commonsense reasoning tasks
- When direct answers might be inaccurate
- To improve accuracy without external sources
Step 1 — generate knowledge:

Generate 3-5 facts that would help answer the question:
"{question}"
Facts should be relevant and accurate.

Step 2 — answer using that knowledge:

Using the following knowledge:
{generated_knowledge}
Answer the question: "{question}"

The technique comes from the paper "Generated Knowledge Prompting for Commonsense Reasoning" (Liu et al., 2022), which reported a 3-5% improvement on commonsense benchmarks compared to direct answering.
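The two templates chain together in code like this. `call_model` is a hypothetical stand-in for a real LLM call, mocked here so the example runs end to end:

```python
KNOWLEDGE_PROMPT = (
    'Generate 3-5 facts that would help answer the question:\n'
    '"{question}"\n'
    'Facts should be relevant and accurate.'
)
ANSWER_PROMPT = (
    'Using the following knowledge:\n'
    '{generated_knowledge}\n'
    'Answer the question: "{question}"'
)

def call_model(prompt):
    # Hypothetical LLM call, mocked for illustration.
    if prompt.startswith("Generate"):
        return "- Golf scoring: the LOWEST total number of strokes wins."
    return "No"

question = "Part of golf is trying to get a higher point total than others. Yes or No?"
knowledge = call_model(KNOWLEDGE_PROMPT.format(question=question))
final = call_model(ANSWER_PROMPT.format(question=question,
                                        generated_knowledge=knowledge))
print(final)  # → No
```

Filling the second template with the first call's output is the entire technique; no retrieval infrastructure is involved.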
| Aspect | Direct Answer | Generate Knowledge | RAG |
|---|---|---|---|
| Knowledge Source | Within model | Within model (explicit) | External documents |
| Infrastructure | Not needed | Not needed | Vector DB |
| Recency | Up to training date | Up to training date | Current |
| Accuracy | Basic | Improved (+3-5%) | High |
Try it yourself
Interactive demo of this technique
Answer the question: is glass a liquid or a solid?
Direct answer (no knowledge generation): Glass is a liquid that flows very slowly. That is why old window panes are thicker at the bottom.
With Generate Knowledge: Let me first generate key facts:
- Glass is an amorphous solid — its molecules do not form a crystalline lattice.
- When a melt cools, glass undergoes vitrification — a transition to a solid state without crystallization.
- The myth of "flowing glass" has been debunked: uneven thickness in old panes is a result of manufacturing techniques, not flow.
- The viscosity of glass at room temperature is so high that noticeable flow would take longer than the age of the universe.
Answer: Glass is an amorphous solid. The common myth that glass is a slowly flowing liquid is not supported by science.
Without knowledge generation, the model reproduced a popular myth. Pre-generating facts activated accurate scientific knowledge and helped filter out misconceptions.
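The Liu et al. paper goes one step further than a single knowledge pass: it samples several knowledge statements, answers once per statement, and then selects among the answers. Below is a minimal sketch of a majority-vote simplification of that selection step, with a mocked model so it runs:

```python
from collections import Counter

def mock_answer_with(knowledge, question):
    # Mocked model: a misleading knowledge string triggers the myth,
    # accurate knowledge yields the correct answer.
    return "liquid" if "flows" in knowledge else "amorphous solid"

def answer_by_vote(question, knowledge_samples):
    # One answer per knowledge statement, then a majority vote.
    votes = Counter(mock_answer_with(k, question) for k in knowledge_samples)
    return votes.most_common(1)[0][0]

samples = [
    "Glass is an amorphous solid.",
    "Old panes are thick at the bottom because of manufacturing methods.",
    "Some say glass flows slowly.",  # the myth sneaks into one sample
]
result = answer_by_vote("Is glass a liquid or a solid?", samples)
print(result)  # → amorphous solid
```

Even when one sampled knowledge statement carries the misconception, aggregation across samples lets the accurate majority win out.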