Context Rot
The degradation of a model's accuracy as its context window fills: beyond a certain point, adding more information hurts performance rather than helping it.
Why it matters
More context isn't always better. As the context window fills, the model starts losing track of what matters, and details buried in the middle of a long context are especially prone to being overlooked.
In practice
This is why Claude Code uses auto-compaction: instead of letting the context fill up and degrade, it compresses older information to keep the most relevant details sharp.
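The compaction idea can be sketched in a few lines. This is a hypothetical illustration, not Claude Code's actual implementation: the `compact` function, the word-count token proxy, and the placeholder `summarize` are all assumptions made for the example.

```python
def count_tokens(messages):
    # Crude proxy: whitespace-split word count stands in for a real tokenizer.
    return sum(len(m["content"].split()) for m in messages)

def summarize(messages):
    # Placeholder summarizer; a real system would call a model here.
    preview = "; ".join(m["content"][:20] for m in messages)
    return {"role": "system", "content": f"[Summary of earlier turns: {preview}]"}

def compact(messages, limit=50, keep_recent=3):
    """If the history exceeds `limit` tokens, collapse older turns into one
    summary message, keeping the most recent `keep_recent` turns verbatim."""
    if count_tokens(messages) <= limit or len(messages) <= keep_recent:
        return messages
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    return [summarize(older)] + recent

history = [{"role": "user", "content": "message " * 10} for _ in range(8)]
compacted = compact(history)
print(len(history), len(compacted))  # older turns collapsed into one summary
```

The design choice to illustrate: recent turns survive verbatim while older ones are lossily compressed, trading detail about the distant past for headroom in the window.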
Related terms
Context Window
The AI model's "working memory" — how much information it can process at once (e.g., 1 million tokens for Claude Opus 4.6).
Auto-Compaction
Automatic summarization of older context when the context window fills up, preserving recent information.
Context Engineering
The practice of shaping and managing information fed to an AI model for optimal results.