Question
How can I implement a Least Recently Used (LRU) cache in Java using Generics and achieve O(1) operations?
// Sample code for an LRU Cache in Java with Generics
import java.util.HashMap;
import java.util.Map;

public class LRUCache<K, V> {
    private final int capacity;
    private final Map<K, Node<K, V>> cache;
    private final DoublyLinkedList<K, V> list;

    public LRUCache(int capacity) {
        if (capacity <= 0) {
            throw new IllegalArgumentException("capacity must be positive");
        }
        this.capacity = capacity;
        this.cache = new HashMap<>();
        this.list = new DoublyLinkedList<>();
    }

    public V get(K key) {
        Node<K, V> node = cache.get(key);
        if (node == null) return null;
        list.moveToFront(node);
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = cache.get(key);
        if (node != null) {
            node.value = value;
            list.moveToFront(node);
        } else {
            if (cache.size() == capacity) {
                // Evict the least recently used entry from both structures.
                Node<K, V> last = list.removeLast();
                cache.remove(last.key);
            }
            Node<K, V> newNode = list.addFront(key, value);
            cache.put(key, newNode);
        }
    }
}

class Node<K, V> {
    K key;
    V value;
    Node<K, V> prev, next;

    Node(K key, V value) { this.key = key; this.value = value; }
}

class DoublyLinkedList<K, V> {
    // Sentinel head and tail avoid null checks when splicing at either end.
    private final Node<K, V> head, tail;

    public DoublyLinkedList() {
        head = new Node<>(null, null);
        tail = new Node<>(null, null);
        head.next = tail;
        tail.prev = head;
    }

    public Node<K, V> addFront(K key, V value) {
        return addFront(new Node<>(key, value));
    }

    // Splices an existing node in right after the head sentinel.
    private Node<K, V> addFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
        return node;
    }

    // Re-links the same node at the front, so the HashMap's reference to it
    // stays valid; allocating a replacement node here would desynchronize
    // the map and the list.
    public void moveToFront(Node<K, V> node) {
        remove(node);
        addFront(node);
    }

    public Node<K, V> removeLast() {
        Node<K, V> toRemove = tail.prev;
        remove(toRemove);
        return toRemove;
    }

    private void remove(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
}
Answer
A Least Recently Used (LRU) cache evicts the entry that has gone unused the longest once capacity is reached. By pairing a HashMap for O(1) lookup with a doubly linked list for O(1) recency updates, both get and put run in constant time, which makes the structure practical for real-world workloads.
// See the provided code snippet for a complete implementation of LRU Cache.
Causes
- An LRU cache manages limited space by discarding the least recently used items when room is needed for new ones.
- In applications where memory and speed are critical, O(1) cache operations keep the cache itself from becoming a bottleneck.
Solutions
- Use a HashMap to index cache entries by key for O(1) lookup.
- Use a doubly linked list to track recency of use, so nodes can be added at the front and removed from the tail in O(1).
- Combine the two structures: the map provides fast lookup while the list maintains the eviction order.
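As an alternative to hand-rolling both structures, Java's standard library already combines them: LinkedHashMap maintains a doubly linked list over its entries, and constructing it with accessOrder set to true makes every get() move the accessed entry to the most recently used position. A minimal sketch (the class name LinkedLruCache is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: an access-ordered LinkedHashMap gives the same
// O(1) LRU behavior as the hand-built HashMap + doubly linked list.
public class LinkedLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LinkedLruCache(int capacity) {
        // accessOrder = true: get() moves the entry to the most recent end.
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after each put(); evict the least recently used entry
        // once the size exceeds the capacity.
        return size() > capacity;
    }

    public static void main(String[] args) {
        LinkedLruCache<Integer, String> cache = new LinkedLruCache<>(2);
        cache.put(1, "one");
        cache.put(2, "two");
        cache.get(1);          // touch 1 so it becomes most recently used
        cache.put(3, "three"); // evicts key 2, the least recently used
        System.out.println(cache.containsKey(1)); // true
        System.out.println(cache.containsKey(2)); // false
        System.out.println(cache.containsKey(3)); // true
    }
}
```

The hand-built version above remains useful when you need control over the node layout or cannot extend LinkedHashMap, but for most applications the standard-library approach is shorter and harder to get wrong.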
Common Mistakes
Mistake: Not enforcing the capacity limit, so the cache grows without bound.
Solution: Before inserting a new entry, check the size against the capacity and evict the least recently used entry if the cache is full.
Mistake: Retrieval that scans the structure, degrading get() to O(n).
Solution: Look the node up directly in the HashMap and have get() move it to the front of the list; both are O(1) steps.
Mistake: Updating recency by allocating a replacement node, which leaves the HashMap pointing at a detached node and corrupts the list.
Solution: Re-link the existing node when moving it to the front so the map and the list stay in sync.
Helpers
- LRU cache in Java
- Java Generics
- O(1) cache operations
- Least Recently Used cache
- Java data structures
- efficient caching techniques