Context Window
The AI model's "working memory" — how much information it can process at once (e.g., 1 million tokens for Claude Opus 4.6).
The context window determines how much of your codebase, documents, or conversation history an agent can see at once. Larger windows let an agent keep more material in view simultaneously, so it can reason across files or long conversations without losing track of earlier details.
Claude Opus 4.6's 1M token context window lets Claude Code understand entire codebases in a single session — reading multiple files, understanding architecture, and making coherent changes.
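To get a feel for what fits in a 1M-token window, you can estimate token counts from raw text size. The sketch below uses the common rule of thumb of roughly 4 characters per token (an approximation only; real tokenizer output varies by language and content, and the function names here are illustrative, not part of any API):

```python
# Rough check of whether a set of files fits in a context window,
# using the ~4 characters-per-token heuristic (an approximation;
# actual tokenizers produce different counts per input).

CHARS_PER_TOKEN = 4  # heuristic, not an exact tokenizer


def estimate_tokens(text: str) -> int:
    """Estimate the token count of a piece of text from its length."""
    return len(text) // CHARS_PER_TOKEN


def fits_in_window(file_contents: list[str], window_tokens: int = 1_000_000) -> bool:
    """Check whether the combined contents fit within the context window."""
    total = sum(estimate_tokens(content) for content in file_contents)
    return total <= window_tokens


# Example: a few small sources easily fit in a 1M-token window.
sources = ["def main():\n    print('hello')\n", "x" * 40_000, "# README\n"]
print(fits_in_window(sources))
```

By this estimate, a 1M-token window holds on the order of 4 MB of plain text, which is why an agent can hold many source files of a typical codebase in view at once.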
Related terms
Token
The basic unit of text an AI model processes; roughly 3/4 of an English word on average.
Context Rot
Degradation of model accuracy as the context window fills with too much information.
Auto-Compaction
Automatic summarization of older context when the context window fills up, preserving recent information.
LLM (Large Language Model)
A large language model like Claude, GPT, or Gemini. The "brain" that understands and generates language.