from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.cache = OrderedDict()
        self.capacity = capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Mark the key as most recently used
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Evict the least recently used item (front of the OrderedDict)
            self.cache.popitem(last=False)
#include <unordered_map>
#include <list>

class LRUCache {
public:
    LRUCache(int capacity) : cap(capacity) {}

    int get(int key) {
        if (cache.find(key) == cache.end())
            return -1;
        // Move the accessed node to the front
        items.splice(items.begin(), items, cache[key]);
        return cache[key]->second;
    }

    void put(int key, int value) {
        if (cache.find(key) != cache.end()) {
            // Update value and move to front
            cache[key]->second = value;
            items.splice(items.begin(), items, cache[key]);
        } else {
            if (static_cast<int>(items.size()) == cap) {
                // Evict the least recently used item at the back
                int old_key = items.back().first;
                items.pop_back();
                cache.erase(old_key);
            }
            items.emplace_front(key, value);
            cache[key] = items.begin();
        }
    }

private:
    int cap;
    std::list<std::pair<int, int>> items;
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> cache;
};
import java.util.*;

class LRUCache {
    private final int capacity;
    private final LinkedHashMap<Integer, Integer> cache;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        // Access-ordered LinkedHashMap; removeEldestEntry evicts the least
        // recently used entry once the size exceeds the capacity.
        this.cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > LRUCache.this.capacity;
            }
        };
    }

    public int get(int key) {
        return cache.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cache.put(key, value);
    }
}
class LRUCache {
    constructor(capacity) {
        this.capacity = capacity;
        this.cache = new Map();
    }

    get(key) {
        if (!this.cache.has(key)) return -1;
        // Re-insert the key so it becomes the most recently used entry
        const value = this.cache.get(key);
        this.cache.delete(key);
        this.cache.set(key, value);
        return value;
    }

    put(key, value) {
        if (this.cache.has(key)) {
            this.cache.delete(key);
        }
        this.cache.set(key, value);
        if (this.cache.size > this.capacity) {
            // Remove least recently used (the first key in insertion order)
            const oldestKey = this.cache.keys().next().value;
            this.cache.delete(oldestKey);
        }
    }
}
The LRU (Least Recently Used) Cache problem asks you to design a data structure that supports two operations:

get(key): Returns the value of the key if it exists in the cache, otherwise returns -1.
put(key, value): Inserts or updates the value of the key. If the cache exceeds its capacity, it should invalidate the least recently used item before inserting the new item.

The cache is initialized with a positive capacity, and both operations should run in O(1) average time complexity.
The "least recently used" means that when the cache is full and a new item needs to be inserted, the item that has not been accessed for the longest time is removed.
At first glance, you might think of using a simple dictionary or hash map to store the keys and values. This works well for get and put in terms of fast lookup and update. However, the challenge is to keep track of the usage order so that we can efficiently evict the least recently used item when the cache exceeds its capacity.
A brute-force approach would be to scan all keys to find the least recently used one whenever an eviction is needed, but this would be slow (O(n) time). We need a way to both look up values by key and update the usage order in constant time.

To achieve O(1) time for both operations, we combine a hash map (for constant-time key lookup) with a doubly linked list (for constant-time reordering and eviction from the least recently used end); a sketch of this combination follows below.
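As a sketch of that combination, here is one way to wire the two structures together by hand in Python. The Node class, the helper names, and the class name LRUCacheDLL are illustrative choices for this sketch (the name differs from the solutions above only to avoid a clash), not part of the original problem statement:

class Node:
    """Doubly linked list node holding one cache entry (illustrative sketch)."""
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCacheDLL:
    """LRU cache built from a dict plus a doubly linked list.

    Most recently used entries sit right after `head`; the least
    recently used entry sits right before `tail`.
    """
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}                      # key -> Node
        self.head = Node()                 # sentinel on the most recent side
        self.tail = Node()                 # sentinel on the least recent side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        # Unlink the node from its current position
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node: Node) -> None:
        # Insert the node right after the head sentinel
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)                 # detach from current position
        self._add_to_front(node)           # mark as most recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)
        if len(self.map) > self.capacity:
            lru = self.tail.prev           # least recently used node
            self._remove(lru)
            del self.map[lru.key]

Both get and put touch only a constant number of pointers and one hash map entry, which is what gives the O(1) behavior.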
Many languages provide built-in ordered dictionaries or linked hash maps (like OrderedDict in Python, LinkedHashMap in Java, and Map in JavaScript with insertion order), which can simplify the implementation.
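For example, Python's OrderedDict already provides the two primitives an LRU cache needs: move_to_end marks a key as most recently used, and popitem(last=False) removes the entry at the least recently used end. A quick illustration (the keys here are arbitrary):

from collections import OrderedDict

od = OrderedDict()
od["a"] = 1
od["b"] = 2
od["c"] = 3

od.move_to_end("a")          # "a" becomes the most recently used key
print(list(od))              # ['b', 'c', 'a']

od.popitem(last=False)       # evicts "b", the least recently used key
print(list(od))              # ['c', 'a']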
Let's walk through an example with capacity = 2 (a runnable trace follows the list):

put(1, 1): Cache is {1=1}
put(2, 2): Cache is {1=1, 2=2}
get(1): Returns 1. Cache order is now {2=2, 1=1} (1 is most recently used)
put(3, 3): Cache is full, evicts key 2 (least recently used). Cache is now {1=1, 3=3}
get(2): Returns -1 (not found)
put(4, 4): Cache is full, evicts key 1. Cache is now {3=3, 4=4}
get(1): Returns -1 (not found)
get(3): Returns 3. Cache order is {4=4, 3=3}
get(4): Returns 4. Cache order is {3=3, 4=4}
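As a quick check, the same sequence can be run against the Python OrderedDict-based LRUCache shown at the top of this page; the expected results are noted as comments:

cache = LRUCache(2)
cache.put(1, 1)            # cache: {1: 1}
cache.put(2, 2)            # cache: {1: 1, 2: 2}
print(cache.get(1))        # 1   (1 is now most recently used)
cache.put(3, 3)            # evicts key 2; cache: {1: 1, 3: 3}
print(cache.get(2))        # -1  (not found)
cache.put(4, 4)            # evicts key 1; cache: {3: 3, 4: 4}
print(cache.get(1))        # -1  (not found)
print(cache.get(3))        # 3
print(cache.get(4))        # 4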
Brute-force approach:
Time: O(n) per get or put if we have to search for the least recently used key by scanning all items.
Space: O(capacity) for storing up to capacity items.

Optimized approach (hash map + doubly linked list):
Time: O(1) for both get and put, because both hash map lookups and linked list operations are constant time.
Space: O(capacity) for the hash map and list holding up to capacity key-value pairs and their nodes.

The LRU Cache problem is a classic example of combining data structures (hash maps for fast lookup, doubly linked lists for fast order management) to achieve constant-time operations. The key insight is realizing that no single data structure can efficiently manage both fast access and fast eviction order, but together they provide an elegant, efficient solution. This pattern is common in systems that need to manage limited resources, such as memory caches and page replacement algorithms.