
Tech Glossary
Least Recently Used (LRU) Cache
Least Recently Used (LRU) Cache is a caching strategy that manages limited storage by discarding the least recently accessed items first when the cache reaches its capacity. The underlying principle is that data not accessed for the longest time is less likely to be needed soon.
How LRU Cache Works
Accessing Items: When an item is read or updated, it is moved to the most recently used position.
Adding New Items: If the cache is full, the least recently used item is removed to make space for the new item.
Updating Usage: Each access updates the usage history, ensuring the most recently used items are retained.
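The three behaviors above can be sketched with Python's collections.OrderedDict, which remembers insertion order and can move entries to the end in O(1). The capacity of 2 and the keys "a", "b", "c" are illustrative choices, not part of the definition:

```python
from collections import OrderedDict

def lru_demo():
    capacity = 2
    cache = OrderedDict()  # entry order doubles as usage order

    def access(key):
        # Accessing an item marks it as most recently used.
        cache.move_to_end(key)
        return cache[key]

    def add(key, value):
        if key in cache:
            cache.move_to_end(key)
        elif len(cache) >= capacity:
            # Cache is full: evict the least recently used entry,
            # which sits at the front of the ordered dict.
            cache.popitem(last=False)
        cache[key] = value

    add("a", 1)
    add("b", 2)
    access("a")   # "a" becomes the most recently used
    add("c", 3)   # cache is full, so "b" is evicted
    return list(cache)

print(lru_demo())  # ['a', 'c']
```

Note how accessing "a" before inserting "c" is what saves it from eviction; without that access, "a" would have been the item discarded.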
Implementation Details
An efficient LRU Cache can be implemented using a combination of a hash map and a doubly linked list:
Hash Map: Maps each key to its node in the list, providing O(1) average-time access to cache items.
Doubly Linked List: Maintains the order of item usage, with the most recently used items at the front and the least recently used at the back.
This combination allows both get and put to run in O(1) time: the hash map locates a node instantly, and the doubly linked list lets that node be unlinked and moved to the front without traversal.
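A minimal sketch of this hash-map-plus-doubly-linked-list design is below. The class and method names (LRUCache, get, put) are illustrative; sentinel head and tail nodes are used so that unlinking never needs edge-case checks:

```python
class Node:
    """Doubly linked list node holding one key/value pair."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    """LRU cache: hash map for O(1) lookup, doubly linked list for usage order.

    The most recently used node sits just after `head`; the least
    recently used node sits just before `tail`.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}               # key -> Node
        self.head, self.tail = Node(), Node()   # sentinels
        self.head.next = self.tail
        self.tail.prev = self.head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return None
        node = self.map[key]
        self._unlink(node)          # accessing marks it most recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev    # least recently used node
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

Example usage: with capacity 2, after put("a", 1), put("b", 2), get("a"), and put("c", 3), the cache evicts "b" (the least recently used key), so get("b") returns None while "a" and "c" remain retrievable.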
Use Cases
Operating Systems: Managing memory pages to optimize performance.
Web Browsers: Caching web pages to reduce load times.
Databases: Caching query results for faster data retrieval.
Content Delivery Networks (CDNs): Serving frequently accessed content efficiently.
Comparison with Other Caching Strategies
Least Frequently Used (LFU): Evicts items accessed least often, regardless of recency.
First-In, First-Out (FIFO): Removes the oldest items added to the cache, without considering usage frequency or recency.
LRU offers a balance between recency and frequency, making it suitable for many applications.
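In practice, many standard libraries ship an LRU cache ready to use; Python's functools.lru_cache, for example, memoizes function calls and discards the least recently used result once maxsize is reached. The Fibonacci function here is just a convenient demonstration:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    # Cached recursive Fibonacci: repeated subproblems hit the cache
    # instead of being recomputed.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, computed in linear rather than exponential calls
```

cache_info() on the decorated function reports hits, misses, and the current size, which is useful for checking that the cache is actually being exercised.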