Information processing theory is a cognitive framework that conceptualizes the human mind as a system that receives, encodes, stores, and retrieves information—much like a computer processes data. Emerging during the cognitive revolution of the 1950s and 1960s, this approach fundamentally changed how psychologists think about mental processes, shifting the focus from observable behavior (as in behaviorism) to the internal mechanisms of thought.
## Origins in the Cognitive Revolution
Information processing theory arose alongside the development of digital computers and communication theory in the mid-20th century. Pioneers such as George Miller, Herbert Simon, Allen Newell, and Ulric Neisser drew explicit analogies between human cognition and computer operations. The publication of Miller's landmark 1956 paper "The Magical Number Seven, Plus or Minus Two" highlighted the limited capacity of short-term memory and helped establish the information processing paradigm. Claude Shannon's information theory and Norbert Wiener's cybernetics provided formal tools for thinking about how systems transmit, encode, and decode information.
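As a brief illustration of the kind of formal tool Shannon's theory supplied, the sketch below computes the entropy of a message source in bits, i.e. the average information carried per symbol. The symbol probabilities are invented for illustration and are not drawn from any of the works cited above.

```python
import math

def shannon_entropy(probabilities):
    """Entropy H = -sum(p * log2(p)): the average number of bits
    needed to encode one symbol emitted by this source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical source: four symbols with unequal probabilities.
source = [0.5, 0.25, 0.125, 0.125]
print(f"Entropy: {shannon_entropy(source):.2f} bits per symbol")  # 1.75 bits
```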
## The Atkinson-Shiffrin Multi-Store Model
The most influential early model of information processing was proposed by Richard Atkinson and Richard Shiffrin in 1968. Their multi-store model describes three distinct memory stores through which information flows sequentially:
- **Sensory memory** briefly holds raw sensory input (iconic memory for vision, echoic memory for audition) for fractions of a second to a few seconds. Only information that receives attention moves to the next stage.
- **Short-term memory (STM)** holds a limited amount of information (roughly seven items, in line with Miller's seven plus or minus two) for approximately 15-30 seconds. Information in STM can be maintained through rehearsal and may be transferred to long-term storage.
- **Long-term memory (LTM)** has virtually unlimited capacity and duration. Information stored here can be retrieved when needed, though retrieval is not always successful.
This sequential flow—from sensory input through short-term processing to long-term storage—became the canonical model of human information processing.
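A minimal, illustrative sketch of this sequential flow is given below. The class name, capacity limit, and transfer rules are simplifications chosen for illustration, not parameters taken from Atkinson and Shiffrin's original paper.

```python
class MultiStoreMemory:
    """Toy simulation of the Atkinson-Shiffrin flow: sensory input
    -> (attention) -> short-term store -> (rehearsal) -> long-term store."""

    STM_CAPACITY = 7  # rough limit in the spirit of Miller's seven plus or minus two

    def __init__(self):
        self.sensory = []       # decays almost immediately
        self.short_term = []    # limited capacity and duration
        self.long_term = set()  # effectively unlimited

    def perceive(self, stimuli, attended):
        """All stimuli enter sensory memory; only attended ones move to STM."""
        self.sensory = list(stimuli)
        for item in stimuli:
            if item in attended:
                if len(self.short_term) >= self.STM_CAPACITY:
                    self.short_term.pop(0)  # displacement: the oldest item is lost
                self.short_term.append(item)
        self.sensory.clear()  # the sensory trace fades within seconds

    def rehearse(self, item):
        """Rehearsal maintains an item in STM and may transfer it to LTM."""
        if item in self.short_term:
            self.long_term.add(item)

    def retrieve(self, item):
        """Retrieval from LTM is possible but not guaranteed to succeed."""
        return item if item in self.long_term else None


memory = MultiStoreMemory()
memory.perceive(["phone number", "background noise", "face"], attended={"phone number"})
memory.rehearse("phone number")
print(memory.retrieve("phone number"))      # "phone number"
print(memory.retrieve("background noise"))  # None: never attended, never stored
```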
## Levels of Processing: An Alternative View
In 1972, Fergus Craik and Robert Lockhart proposed an influential alternative to the multi-store model. Their levels of processing framework argued that memory is not determined by which store information reaches, but by the depth at which it is processed. Shallow processing (attending to physical features like font or sound) produces weak, short-lived memories, while deep processing (attending to meaning, making associations, elaborating on content) produces strong, durable memories. This framework shifted attention from structural stores to the quality of encoding processes.
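The sketch below caricatures the framework's core claim: the same material encoded at different depths yields different retention. The ordering of the depth levels reflects the framework, but the retention probabilities are invented purely for illustration and do not come from Craik and Lockhart.

```python
import random

rng = random.Random(0)  # fixed seed so the illustration is reproducible

# Hypothetical retention probabilities by depth of processing (made-up values).
RETENTION_BY_DEPTH = {
    "structural": 0.15,  # shallow: "Was the word printed in capital letters?"
    "phonemic": 0.35,    # intermediate: "Does it rhyme with 'train'?"
    "semantic": 0.80,    # deep: "Would it fit meaningfully in this sentence?"
}

def recall(depth):
    """Simulate whether an item encoded at a given depth is later recalled."""
    return rng.random() < RETENTION_BY_DEPTH[depth]

for depth, p in RETENTION_BY_DEPTH.items():
    recalled = sum(recall(depth) for _ in range(100))
    print(f"{depth:>10} (p={p}): {recalled}/100 items recalled")
```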
## Baddeley's Working Memory Model
Alan Baddeley and Graham Hitch refined the concept of short-term memory in 1974 with their working memory model. Rather than a single short-term store, they proposed a multi-component system:
- The **central executive** directs attention and coordinates the other components.
- The **phonological loop** processes and maintains verbal and acoustic information through subvocal rehearsal.
- The **visuospatial sketchpad** handles visual and spatial information.
- The **episodic buffer** (added by Baddeley in 2000) integrates information from the other components and links it with long-term memory.
This model better accounts for the active manipulation of information during complex cognitive tasks like reasoning, comprehension, and problem-solving.
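One way to see the division of labour in Baddeley's model is as a set of specialized buffers coordinated by an executive, as in the illustrative sketch below. The class structure, modality labels, and routing rules are simplifications introduced here, not part of the published model.

```python
from dataclasses import dataclass, field

@dataclass
class PhonologicalLoop:
    """Holds verbal/acoustic material, kept alive by subvocal rehearsal."""
    contents: list = field(default_factory=list)

    def rehearse(self, item: str):
        self.contents.append(item)

@dataclass
class VisuospatialSketchpad:
    """Holds visual and spatial material."""
    contents: list = field(default_factory=list)

    def hold(self, item: str):
        self.contents.append(item)

@dataclass
class EpisodicBuffer:
    """Integrates material from the other components and long-term memory."""
    episode: list = field(default_factory=list)

    def bind(self, *items: str):
        self.episode.append(tuple(items))

class CentralExecutive:
    """Directs attention: routes incoming material to the appropriate subsystem."""
    def __init__(self):
        self.loop = PhonologicalLoop()
        self.sketchpad = VisuospatialSketchpad()
        self.buffer = EpisodicBuffer()

    def attend(self, item: str, modality: str):
        if modality == "verbal":
            self.loop.rehearse(item)
        elif modality == "visuospatial":
            self.sketchpad.hold(item)
        # Bind the new item into the current episode regardless of modality.
        self.buffer.bind(item, modality)

executive = CentralExecutive()
executive.attend("directions: second left after the bridge", "verbal")
executive.attend("mental map of the route", "visuospatial")
print(executive.buffer.episode)
```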
## Information Processing Stages
Regardless of the specific model, information processing theory identifies three fundamental stages:
- **Encoding** is the process of transforming sensory input into a mental representation. Encoding can be automatic or effortful, and the depth and type of encoding strongly influence later retrieval.
- **Storage** is the maintenance of encoded information over time. Information may be stored in different forms (visual, acoustic, semantic) and is organized through schemas, categories, and associative networks.
- **Retrieval** is the process of accessing stored information when needed. Retrieval can be aided by cues, context, and the match between encoding and retrieval conditions.
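These three stages can be read as a simple pipeline, sketched below. The word-feature "encoding" and the cue-overlap retrieval rule are invented simplifications meant only to make the encode/store/retrieve distinction concrete.

```python
# Toy encode -> store -> retrieve pipeline. The "encoding" here is just
# lower-casing and splitting into word features; real encoding is far richer.

memory_store = {}  # storage: encoded traces indexed by a simple key

def encode(event: str) -> set:
    """Transform raw input into a crude feature-based representation."""
    return set(event.lower().split())

def store(key: str, event: str) -> None:
    """Maintain the encoded trace over time, indexed for later access."""
    memory_store[key] = encode(event)

def retrieve(cue: str):
    """Return the stored key whose trace best matches the retrieval cue."""
    cue_features = encode(cue)
    best_key, best_overlap = None, 0
    for key, trace in memory_store.items():
        overlap = len(trace & cue_features)
        if overlap > best_overlap:
            best_key, best_overlap = key, overlap
    return best_key  # None when no cue features match (retrieval failure)

store("picnic", "Sunny afternoon picnic by the river with friends")
store("exam", "Stressful morning exam in the cold lecture hall")
print(retrieve("that sunny day by the river"))  # "picnic"
print(retrieve("completely unrelated cue"))     # None
```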
## Strengths of the Computer Metaphor
The computer metaphor provided cognitive psychology with a powerful vocabulary and set of conceptual tools. It made internal mental processes scientifically tractable by offering precise, testable models. It inspired productive research programs on attention, memory, problem-solving, and language processing. The metaphor also facilitated fruitful exchange between psychology and computer science, contributing to the development of artificial intelligence and cognitive modeling.
## Limitations and Critiques
Despite its enormous influence, information processing theory has faced significant critiques:
- **The embodied cognition challenge**: Critics argue that cognition is not purely abstract symbol manipulation but is deeply shaped by the body and its interactions with the environment. Thinking is not just "in the head" but involves bodily states, gestures, and environmental scaffolding.
- **Neglect of emotion**: The computer metaphor tends to treat cognition as cold and rational, underemphasizing the pervasive influence of emotions, motivation, and social context on all stages of information processing.
- **Serial vs. parallel processing**: Early models assumed largely serial (step-by-step) processing, while the brain actually performs massively parallel computation.
- **Ecological validity**: Laboratory-based information processing research may not capture how cognition works in real-world, context-rich environments.
## Legacy in Cognitive Load Theory and Instructional Design
One of the most practically important legacies of information processing theory is cognitive load theory, developed by John Sweller in the 1980s. By taking seriously the limited capacity of working memory and the role of schemas in long-term memory, cognitive load theory provides evidence-based principles for designing effective instruction. It distinguishes intrinsic load (inherent difficulty of the material), extraneous load (unnecessary processing caused by poor design), and germane load (processing that contributes to schema construction), guiding educators to optimize learning by managing these demands.
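As a rough illustration of how these three load types are reasoned about, the sketch below treats them additively against a fixed working-memory budget. The additive treatment is a common simplification, and the numeric values and class names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class LessonDesign:
    """Hypothetical load estimates for a unit of instruction (arbitrary units)."""
    intrinsic: float    # inherent complexity of the material itself
    extraneous: float   # load imposed by poor presentation or design
    germane: float      # load devoted to building schemas

WORKING_MEMORY_BUDGET = 10.0  # illustrative capacity, not an empirical value

def evaluate(design: LessonDesign) -> str:
    """Compare the summed load against the (assumed) working-memory budget."""
    total = design.intrinsic + design.extraneous + design.germane
    if total > WORKING_MEMORY_BUDGET:
        return f"Overloaded ({total:.1f} > {WORKING_MEMORY_BUDGET}): reduce extraneous load first."
    return f"Within capacity ({total:.1f} <= {WORKING_MEMORY_BUDGET}): room for more germane processing."

cluttered = LessonDesign(intrinsic=6, extraneous=5, germane=1)
streamlined = LessonDesign(intrinsic=6, extraneous=1, germane=3)
print(evaluate(cluttered))    # overloaded: the presentation needs redesigning
print(evaluate(streamlined))  # same material, manageable total load
```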