LRU Cache Design
The LRU cache is a classic design problem that combines data structures with object-oriented thinking. A strong answer explains why a hashmap plus a doubly linked list gives O(1) lookups and updates.
Problem Statement
Design a cache that evicts the least recently used item when capacity is reached.
Standard Solution
- Hashmap for O(1) lookup by key
- Doubly linked list for O(1) recency updates and eviction
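Before hand-rolling the linked list, the combination above can be sketched compactly with Python's collections.OrderedDict, which itself maintains a hashmap plus a doubly linked list internally. This is a minimal sketch, not the only acceptable interview answer; the class and method names mirror the operations described below.

```python
from collections import OrderedDict

class LRUCache:
    """LRU cache sketch backed by OrderedDict (hashmap + linked order)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)        # mark as most recently used
        return self.data[key]

    def put(self, key, value) -> None:
        if key in self.data:
            self.data.move_to_end(key)    # updating also refreshes recency
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used entry
```

Both move_to_end and popitem run in O(1), so this preserves the complexity argument while staying short enough to write quickly.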
Key Operations
- get(key) returns the value and moves the entry to the front
- put(key, value) inserts or updates, and evicts the tail when full
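The explicit hashmap-plus-doubly-linked-list version of these operations can be sketched as follows. Sentinel head and tail nodes (an implementation choice, not mandated by the problem) avoid edge-case checks when the list is empty; returning -1 for a missing key is one common convention.

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    """Hashmap gives O(1) lookup; the list orders entries by recency.

    Most recently used entries sit at the front (after head);
    the node before tail is evicted when capacity is exceeded.
    """
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}                 # key -> Node
        self.head = Node(None, None)  # sentinel on the most-recent side
        self.tail = Node(None, None)  # sentinel on the least-recent side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        """Insert a node right after the head sentinel in O(1)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)       # the entry was just used,
        self._add_front(node)    # so move it to the front
        return node.value

    def put(self, key, value) -> None:
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev      # least recently used real node
            self._remove(lru)
            del self.map[lru.key]     # node stores its key for this delete
```

Storing the key on each node matters: without it, evicting the tail node would require an O(n) search through the hashmap to find which key to delete.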
Interview Tip
Always state time complexity clearly. That is usually the primary evaluation point.