Least Recently Used (LRU) Algorithm

Introduction to Page Replacement Algorithms

Page replacement algorithms are essential components of operating systems that manage the allocation and deallocation of memory pages.

These algorithms decide which pages to keep in the main memory and which to evict when new pages need to be loaded. The efficiency of these algorithms significantly impacts system performance.

Least Recently Used (LRU)

Understanding the LRU Algorithm

The Least Recently Used (LRU) algorithm is a popular page replacement strategy that operates on the principle of locality of reference. It assumes that if a page is referenced recently, it is likely to be referenced again in the near future. 

To implement the LRU algorithm, the system maintains a list or queue of the pages in main memory. When a page needs to be replaced, the algorithm selects the page with the earliest access time, indicating that it has been the least recently used. This page is then swapped out and replaced by the new page.

Implementing LRU Algorithm in Operating Systems

Operating systems often implement the LRU algorithm to manage virtual memory and disk caching. The LRU algorithm can be implemented using various data structures such as linked lists, hash tables, or priority queues.

These data structures enable efficient tracking of page access times and facilitate the selection of the least recently used page for replacement.

LRU Algorithm:

1. Initialize an empty cache of size N (number of frames) and an empty queue (used to keep track of the order of page references).

2. While processing page references:
   a. Read the next page reference.
   b. If the page is present in the cache (cache hit), move the corresponding entry to the front of the queue.
   c. If the page is not present in the cache (cache miss):
      i. If the cache is full (all frames are occupied):
         1. Remove the least recently used page (the page at the end of the queue).
         2. Add the new page to the cache and insert it at the front of the queue.
      ii. If the cache has an empty frame:
         1. Add the new page to the cache and insert it at the front of the queue.
   d. Repeat steps (a) to (c) until all page references are processed.

3. At the end of processing page references, the cache will contain the most recently used pages.
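The steps above can be sketched as a short simulation in Java. The reference string and frame count below are illustrative choices, not values from the text; a deque ordered most-recently-used first plays the role of both the cache and the queue:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class LRUSimulation {
    // Process a page reference string with N frames, returning {hits, misses}.
    // The deque is kept ordered most-recently-used first.
    static int[] simulate(int[] refs, int frames) {
        Deque<Integer> queue = new ArrayDeque<>();
        int hits = 0, misses = 0;
        for (int page : refs) {
            if (queue.remove(page)) {          // cache hit: page found in the queue
                hits++;
            } else {                           // cache miss
                misses++;
                if (queue.size() == frames) {
                    queue.removeLast();        // evict the least recently used page
                }
            }
            queue.addFirst(page);              // page is now the most recently used
        }
        return new int[] { hits, misses };
    }

    public static void main(String[] args) {
        int[] refs = { 7, 0, 1, 2, 0, 3, 0, 4 };  // hypothetical reference string
        int[] result = simulate(refs, 3);          // 3 frames
        System.out.println("hits=" + result[0] + " misses=" + result[1]);
    }
}
```

With three frames, this string yields 2 hits and 6 misses: the repeated references to page 0 arrive while page 0 is still resident, and every other reference faults.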

Example of Least Recently Used Page Replacement Algorithm

  • Let’s consider a simple example to illustrate how the LRU algorithm works.
  • Suppose we have a main memory that can hold only three pages.
  • The memory initially contains pages A, B, and C, of which C has gone unreferenced the longest.
  • Now, when a new page D is requested, the LRU algorithm determines that page C, which hasn’t been accessed for the longest time, should be replaced with page D.
Initial Memory State: A B C
Memory State after Requesting D: A B D

Implementation of the Least Recently Used (LRU) Algorithm
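A minimal sketch of the implementation walked through below: a HashMap gives O(1) lookup, and a doubly linked list, ordered from most to least recently used, gives O(1) recency updates. The sentinel nodes and method names here are one possible design, not a fixed API:

```java
import java.util.HashMap;

public class LRUCache {
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final HashMap<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0); // sentinel: most recently used side
    private final Node tail = new Node(0, 0); // sentinel: least recently used side

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    private void unlink(Node n) {
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    private void insertAtFront(Node n) {
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }

    public int get(int key) {
        Node n = map.get(key);
        if (n == null) return -1;   // cache miss
        unlink(n);                  // refresh recency: move node to the front
        insertAtFront(n);
        return n.value;
    }

    public void put(int key, int value) {
        Node n = map.get(key);
        if (n != null) {            // update an existing entry
            n.value = value;
            unlink(n);
            insertAtFront(n);
            return;
        }
        if (map.size() == capacity) {
            Node lru = tail.prev;   // evict the least recently used entry
            unlink(lru);
            map.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        insertAtFront(fresh);
    }

    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));   // 1
        cache.put(3, 3);                    // evicts key 2
        System.out.println(cache.get(2));   // -1
        cache.put(4, 4);                    // evicts key 1
        System.out.println(cache.get(1));   // -1
        System.out.println(cache.get(3));   // 3
        System.out.println(cache.get(4));   // 4
    }
}
```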


The LRUCache class implements the Least Recently Used (LRU) cache algorithm in Java, using a combination of a HashMap and a doubly linked list to store key-value pairs. The get() method retrieves the value associated with a key and moves the corresponding node to the front of the list; the put() method inserts or updates a key-value pair, evicting the least recently used item if the cache is full.

The example usage creates a cache object with a capacity of 2 and performs a series of put and get operations on it:

  • First, it puts key-value pairs (1, 1) and (2, 2) into the cache.
  • Then, it retrieves the value associated with key 1 using the get() method, which returns 1.
  • Next, it puts a new key-value pair (3, 3) into the cache; this exceeds the capacity, so the least recently used entry, key 2, is evicted.
  • When it then tries to retrieve the value associated with key 2, get() returns -1, since that entry was evicted.
  • It puts a new key-value pair (4, 4) into the cache, again exceeding the capacity and evicting the least recently used entry, which is now key 1.
  • Finally, it retrieves the values associated with keys 1, 3, and 4, which return -1, 3, and 4 respectively.

  • The output of the program will be:

    • 1 (since the value associated with key 1 is present)
    • -1 (since key 2 was evicted from the cache)
    • -1 (since key 1 was evicted from the cache)
    • 3 (the value associated with key 3)
    • 4 (the value associated with key 4)

An equivalent LRU cache in Python can be built on collections.OrderedDict: the most recently used items are moved to the end of the cache, and the least recently used items are evicted from the front when the cache exceeds its capacity.

The LRU algorithm offers several benefits and finds applications in various systems:

  • Efficient Cache Management: The LRU algorithm is commonly used in CPU caches to improve cache hit rates and reduce memory latency.

  • Database Management Systems: LRU is used in database systems to manage the buffer pool, ensuring that frequently accessed data remains in memory.

  • Web Caching: LRU is employed in web caching mechanisms to store and retrieve frequently accessed web pages, enhancing browsing speed.

While the LRU algorithm offers significant advantages, it also faces certain challenges and limitations:

  1. Implementation Complexity: Implementing the LRU algorithm efficiently requires careful design and consideration of data structures, which can be complex.

  2. High Overhead: Tracking page access times and maintaining the LRU list incurs additional overhead, which can impact overall system performance.

  3. Unequal Access Patterns: In scenarios where page access patterns deviate significantly from the assumption of locality, the LRU algorithm may not perform optimally.

Comparing LRU with Most Recently Used Algorithm

  • In contrast to the LRU algorithm, the Most Recently Used (MRU) algorithm replaces the page that has been accessed most recently.
  • While LRU focuses on discarding the least recently used page, MRU prioritizes the most recently used page for replacement.
  • Both LRU and MRU have their advantages and use cases. LRU tends to perform better in scenarios where the locality of reference is high, as it maximizes the cache hit rate by keeping frequently accessed pages in memory.
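One way to see the difference is a small simulation. The cyclic reference string below is an illustrative worst case for LRU: with 3 frames and pages requested in the repeating cycle 1, 2, 3, 4, pure LRU misses on every single reference, while MRU keeps some pages resident across cycles:

```java
import java.util.ArrayList;
import java.util.List;

public class PolicyComparison {
    // Simulate a replacement policy over a page reference string.
    // The frames list is ordered most-recently-used first.
    // evictMostRecent = true gives MRU behavior, false gives LRU.
    static int countHits(int[] refs, int capacity, boolean evictMostRecent) {
        List<Integer> frames = new ArrayList<>();
        int hits = 0;
        for (int page : refs) {
            if (frames.remove(Integer.valueOf(page))) {
                hits++;                                       // hit: recency refreshed below
            } else if (frames.size() == capacity) {
                // miss with full frames: evict per policy
                frames.remove(evictMostRecent ? 0 : frames.size() - 1);
            }
            frames.add(0, page);                              // page is now most recently used
        }
        return hits;
    }

    public static void main(String[] args) {
        // Cyclic reference string: 1, 2, 3, 4 repeated five times, with 3 frames.
        int[] refs = new int[20];
        for (int i = 0; i < 20; i++) refs[i] = (i % 4) + 1;
        System.out.println("LRU hits: " + countHits(refs, 3, false));
        System.out.println("MRU hits: " + countHits(refs, 3, true));
    }
}
```

On this pattern LRU scores 0 hits out of 20 references, while MRU scores 11: a sequential scan over more distinct pages than frames defeats LRU entirely, which is exactly the kind of workload where MRU is preferred.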


The Least Recently Used (LRU) algorithm is a powerful page replacement strategy that optimizes memory management in operating systems.

By evicting the least recently used page, it improves cache hit rates and enhances system performance. Understanding the LRU algorithm’s principles, benefits, and limitations is crucial for system designers and software developers.
