- Selfish-LRU: Preemption-Aware Caching for Predictability and Performance
- Conference: Real-Time and Embedded Technology and Applications Symposium
- In: 2014 IEEE 20th Real-Time and Embedded Technology and Applications Symposium (RTAS)
- Publisher: Piscataway, NJ: IEEE
- Document type: Conference contribution
- Faculty of Science (FNWI), Informatics Institute (IVI)
We introduce Selfish-LRU, a variant of the LRU (least recently used) cache replacement policy that improves performance and predictability in preemptive scheduling scenarios. In multitasking systems with conventional caches, a single memory access by a preempting task can trigger a chain reaction leading to a large number of additional cache misses in the preempted task. Selfish-LRU prevents such chain reactions by first evicting cache blocks that do not belong to the currently active task. Simulations confirm that Selfish-LRU reduces the CRPD (cache-related preemption delay) as well as the overall number of cache misses. At the same time, it simplifies CRPD analysis and results in smaller CRPD bounds.
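The abstract's core idea — on a miss, prefer to evict blocks that do not belong to the currently active task, falling back to ordinary LRU only when every block in the set belongs to that task — can be illustrated with a minimal sketch. The class and method names below are illustrative, not from the paper, and the sketch models a single fully associative set rather than a real set-associative cache:

```python
from collections import OrderedDict

class SelfishLRUCache:
    """Sketch of one Selfish-LRU cache set.

    On a miss with a full set, the victim is the least recently used
    block among those NOT owned by the active task; if all blocks
    belong to the active task, plain LRU is used instead.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        # addr -> owning task id; iteration order runs from LRU to MRU
        self.blocks = OrderedDict()

    def access(self, addr, task):
        """Access `addr` on behalf of `task`; return True on a hit."""
        if addr in self.blocks:
            self.blocks.move_to_end(addr)  # hit: refresh recency
            self.blocks[addr] = task       # block now owned by active task
            return True
        if len(self.blocks) >= self.capacity:
            # Selfish-LRU victim selection: first (i.e. least recently
            # used) block owned by another task, else the plain LRU block.
            victim = next((a for a, t in self.blocks.items() if t != task),
                          next(iter(self.blocks)))
            del self.blocks[victim]
        self.blocks[addr] = task
        return False
```

For example, with capacity 4, if task 0 touches A, B, C and a preempting task 1 touches X, a subsequent access to D by task 0 evicts X (task 1's block) rather than A, the globally least recently used block — which is how Selfish-LRU avoids the chain of extra misses the preempted task would otherwise suffer.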