information-theory - Concepts
Explore concepts tagged with "information-theory"
Total concepts: 4
Concepts
- Context Entropy - The natural tendency of AI context systems to degrade toward disorder over time, accumulating contradictions, redundancies, and noise until usefulness declines.
- Grossman-Stiglitz Paradox - The paradox that if markets are informationally efficient, there is no incentive to gather information, which undermines that efficiency.
- Information Compression - The process of condensing information into its most essential form while preserving meaning, enabling faster processing and better retention.
- Context Signal-to-Noise Ratio - The proportion of task-relevant versus irrelevant information in an AI agent's context window, serving as the core metric that context engineering optimizes.
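The signal-to-noise idea above can be made concrete with a minimal sketch. Everything here is illustrative: `context_snr` and the sample chunks are hypothetical, not part of any library, and real systems would use a relevance model rather than hand-labeled indices.

```python
# Hypothetical sketch: measuring a context window's signal-to-noise ratio
# as the fraction of task-relevant tokens among all tokens in the context.

def context_snr(chunks, relevant_idx):
    """Return relevant tokens / total tokens for a list of text chunks.

    chunks: list of strings making up the context window.
    relevant_idx: set of indices into `chunks` judged task-relevant
                  (hand-labeled here; a real system would score this).
    """
    relevant = sum(len(c.split()) for i, c in enumerate(chunks) if i in relevant_idx)
    total = sum(len(c.split()) for c in chunks)
    return relevant / total if total else 0.0

# Toy context: two relevant chunks, one noise chunk.
chunks = [
    "User asked to refactor the parser module.",   # relevant
    "Earlier small talk about the weather.",       # noise
    "Parser error occurs on nested brackets.",     # relevant
]
snr = context_snr(chunks, relevant_idx={0, 2})
```

Tracking this ratio as chunks are added or pruned is one way to watch context entropy in action: unchecked accumulation of noise chunks drives the ratio down.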