146. LRU Cache - Leetcode Solution

Code Implementation

Python

from collections import OrderedDict

class LRUCache:

    def __init__(self, capacity: int):
        # OrderedDict keeps keys in insertion order; we treat the front as
        # least recently used and the back as most recently used.
        self.cache = OrderedDict()
        self.capacity = capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Mark the key as most recently used by moving it to the back.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            # Refresh the key's position; plain assignment does not reorder it.
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Evict the least recently used entry from the front.
            self.cache.popitem(last=False)
      
C++

#include <unordered_map>
#include <list>

class LRUCache {
public:
    LRUCache(int capacity) : cap(capacity) {}

    int get(int key) {
        if (cache.find(key) == cache.end())
            return -1;
        // Move the accessed node to the front
        items.splice(items.begin(), items, cache[key]);
        return cache[key]->second;
    }

    void put(int key, int value) {
        if (cache.find(key) != cache.end()) {
            // Update value and move to front
            cache[key]->second = value;
            items.splice(items.begin(), items, cache[key]);
        } else {
            // Evict the least recently used entry from the back of the list.
            if ((int)items.size() >= cap) {
                int old_key = items.back().first;
                items.pop_back();
                cache.erase(old_key);
            }
            items.emplace_front(key, value);
            cache[key] = items.begin();
        }
    }
private:
    int cap;
    std::list<std::pair<int, int>> items;
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> cache;
};
      
Java

import java.util.*;

class LRUCache {

    private final int capacity;
    private final LinkedHashMap<Integer, Integer> cache;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        // accessOrder = true makes the map reorder entries on every access,
        // so the eldest entry is always the least recently used one.
        this.cache = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                // Evict automatically once the map grows past the capacity.
                return size() > LRUCache.this.capacity;
            }
        };
    }

    public int get(int key) {
        // getOrDefault counts as an access, so a present key is moved to the
        // most recently used position.
        return cache.getOrDefault(key, -1);
    }

    public void put(int key, int value) {
        cache.put(key, value);
    }
}
      
JavaScript

class LRUCache {
    constructor(capacity) {
        this.capacity = capacity;
        // A Map iterates its keys in insertion order, so the first key is
        // always the least recently used one.
        this.cache = new Map();
    }

    get(key) {
        if (!this.cache.has(key)) return -1;
        // Delete and re-insert the key to mark it as most recently used.
        const value = this.cache.get(key);
        this.cache.delete(key);
        this.cache.set(key, value);
        return value;
    }

    put(key, value) {
        if (this.cache.has(key)) {
            this.cache.delete(key);
        }
        this.cache.set(key, value);
        if (this.cache.size > this.capacity) {
            // Remove least recently used
            const oldestKey = this.cache.keys().next().value;
            this.cache.delete(oldestKey);
        }
    }
}
      

Problem Description

The LRU (Least Recently Used) Cache problem asks you to design a data structure that supports two operations:

  • get(key): Returns the value of the key if it exists in the cache, otherwise returns -1.
  • put(key, value): Inserts or updates the value of the key. If the cache exceeds its capacity, it should invalidate the least recently used item before inserting a new item.
The cache operates with a fixed capacity, and both operations should run in O(1) average time. "Least recently used" means that when the cache is full and a new item must be inserted, the item that has gone unaccessed for the longest time is evicted first.

Thought Process

At first glance, you might think of using a simple dictionary or hash map to store the keys and values. This works well for get and put in terms of fast lookup and update. However, the challenge is to keep track of the usage order so that we can efficiently evict the least recently used item when the cache exceeds its capacity.

A brute-force approach would be to scan all keys for the least recently used one whenever an eviction is needed, but that scan costs O(n) time (a sketch of this naive idea follows the list below). We need a way to both:

  • Quickly find and update a key's value (O(1) time)
  • Quickly identify and remove the least recently used key (also O(1) time)
This requires combining two data structures: a hash map for fast access, and a doubly linked list for maintaining the order of usage.
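
To make the cost concrete, here is a minimal brute-force sketch in Python; the class name BruteForceLRUCache and the tick counter are illustrative choices, not part of the problem. Every access stamps the key with a counter, and eviction scans all keys for the smallest stamp, which is the O(n) step.

class BruteForceLRUCache:
    """Naive LRU cache: get is O(1), but eviction scans every key, so put is O(n)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = {}        # key -> value
        self.last_used = {}   # key -> tick of the most recent access
        self.tick = 0         # global counter standing in for time

    def get(self, key: int) -> int:
        if key not in self.data:
            return -1
        self.tick += 1
        self.last_used[key] = self.tick
        return self.data[key]

    def put(self, key: int, value: int) -> None:
        if key not in self.data and len(self.data) >= self.capacity:
            # The O(n) step: scan all keys for the oldest access time.
            oldest = min(self.last_used, key=self.last_used.get)
            del self.data[oldest]
            del self.last_used[oldest]
        self.tick += 1
        self.data[key] = value
        self.last_used[key] = self.tick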

Solution Approach

To achieve O(1) time for both operations, we use:

  • Hash Map (Dictionary): Stores key to node mapping for fast lookup and updates.
  • Doubly Linked List: Maintains the order of access. The most recently used item is moved to the front (head), and the least recently used is at the back (tail). When we need to evict, we remove from the tail.
Here's how each operation works:
  1. get(key): If the key is present, move its node to the head (mark as recently used) and return its value. If not, return -1.
  2. put(key, value): If the key is already present, update its value and move it to the head. If not, add a new node at the head. If the cache exceeds capacity, remove the node at the tail (the least recently used).
This combination allows us to perform all required operations in constant time.
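
The reference solutions above lean on built-in ordered containers, but the node-based design just described can also be written out by hand. The Python sketch below is one minimal way to do it; the Node class, the sentinel head/tail nodes, and the helper names (_remove, _add_to_front) are illustrative choices, not requirements of the problem.

class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key: int = 0, value: int = 0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class ManualLRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}           # key -> Node
        self.head = Node()      # sentinel: most recently used side
        self.tail = Node()      # sentinel: least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        # Unlink a node from the list in O(1).
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node: Node) -> None:
        # Insert a node right after the head sentinel (most recently used).
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_to_front(node)
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)
        if len(self.map) > self.capacity:
            # Evict the least recently used node, just before the tail sentinel.
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]

The sentinel head and tail nodes remove the empty-list edge cases from _remove and _add_to_front, which keeps both helpers branch-free.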

Many languages provide built-in ordered dictionaries or linked hash maps (OrderedDict in Python, LinkedHashMap in Java, and JavaScript's Map, which preserves insertion order), and these simplify the implementation considerably: the Python, Java, and JavaScript solutions above rely on them, while the C++ version wires std::list and std::unordered_map together by hand.

Example Walkthrough

Let's walk through an example with capacity = 2:

  • put(1, 1): Cache is {1=1}
  • put(2, 2): Cache is {1=1, 2=2}
  • get(1): Returns 1. Cache order is now {2=2, 1=1} (1 is most recently used)
  • put(3, 3): Cache is full, evicts key 2 (least recently used). Cache is now {1=1, 3=3}
  • get(2): Returns -1 (not found)
  • put(4, 4): Cache is full, evicts key 1. Cache is now {3=3, 4=4}
  • get(1): Returns -1 (not found)
  • get(3): Returns 3. Cache order is {4=4, 3=3}
  • get(4): Returns 4. Cache order is {3=3, 4=4}
At each step, the cache maintains the most recently used items and discards the least recently used when necessary.
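
Replaying the same sequence against the OrderedDict-based Python LRUCache from the code section (any of the other implementations behaves identically) produces exactly the values predicted above:

cache = LRUCache(2)

cache.put(1, 1)       # cache: {1=1}
cache.put(2, 2)       # cache: {1=1, 2=2}
print(cache.get(1))   # 1   (key 1 becomes most recently used)
cache.put(3, 3)       # evicts key 2 -> cache: {1=1, 3=3}
print(cache.get(2))   # -1  (not found)
cache.put(4, 4)       # evicts key 1 -> cache: {3=3, 4=4}
print(cache.get(1))   # -1  (not found)
print(cache.get(3))   # 3
print(cache.get(4))   # 4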

Time and Space Complexity

Brute-force approach:

  • Time Complexity: get is O(1) with a hash map, but put degrades to O(n) whenever an eviction forces a scan over all stored items to find the least recently used key.
  • Space Complexity: O(capacity), since we store up to capacity items.

Optimized approach (Hash Map + Doubly Linked List):

  • Time Complexity: O(1) average for both get and put, because hash map lookups and linked list insertions, removals, and moves are all constant time.
  • Space Complexity: O(capacity), since we store up to capacity key-value pairs and their list nodes.
The optimized solution is efficient and meets the problem's requirements.

Summary

The LRU Cache problem is a classic example of combining data structures—hash maps for fast lookup and doubly linked lists for fast order management—to achieve constant time operations. The key insight is realizing that no single data structure can efficiently manage both fast access and fast eviction order, but together they provide an elegant, efficient solution. This pattern is common in systems that need to manage limited resources, such as memory caches and page replacement algorithms.