AI-101

Context Window

The maximum amount of text an AI model can consider at once, measured in tokens.

technical limitations
AI Confidence: 85%

AI-generated

What It Means

The context window is the total amount of text a model can process in a single interaction, including both your input and the model's output. Early models had 4K-token windows (about 3,000 words); modern models range from 128K to over 1M tokens.
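Because input and output share the same window, fitting a prompt is a budgeting exercise. A minimal sketch, using the common rule of thumb that English text averages roughly 4 characters per token (an approximation, not a real tokenizer; the function names here are illustrative):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_window(prompt: str, window_tokens: int, output_budget: int = 1024) -> bool:
    """Check whether a prompt plus a reserved output budget fits the window.

    Input tokens and generated output tokens both count against the
    same context window, so space must be left for the response.
    """
    return estimate_tokens(prompt) + output_budget <= window_tokens

# A short prompt easily fits a 4K window; a very long one does not.
print(fits_in_window("Summarize this paragraph.", window_tokens=4096))
print(fits_in_window("x" * 100_000, window_tokens=4096))
```

For accurate counts, use the tokenizer that matches your target model rather than a character heuristic.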

Why It Matters

Context window size determines what you can do. A small window means you cannot paste a full codebase or long document. A large window lets you have extended conversations, analyze entire codebases, or process book-length documents. Understanding context windows helps you choose the right model and structure your prompts effectively.

Sources & Further Reading

Anthropic: Claude model context windows - https://docs.anthropic.com/en/docs/about-claude/models

OpenAI: Model token limits - https://platform.openai.com/docs/models