LFU (Least Frequently Used) is a cache algorithm (also called a cache replacement algorithm or policy). When the cache is full and a new element needs to be added, the algorithm evicts the least frequently used element to make room for the new one.
The cache supports two operations: get(key) and put(key, value).
Calling either operation on a key increments that key's usage count by one.
Once a key-value pair is evicted from the cache, the key's usage count is reset/forgotten, as if the key had never been in the cache.
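The counting and eviction rules above can be sketched in a few lines. This is only an illustration of the bookkeeping (the key names and the `usage` map are made up for the example), not a full cache implementation:

```python
from collections import Counter

capacity = 2
usage = Counter()  # key -> usage count (illustrative bookkeeping only)

# Every get or put of a key bumps that key's usage count by one.
for op_key in ["a", "b", "a"]:
    usage[op_key] += 1

# The cache is now full; inserting "c" must evict the least frequently
# used key first.
victim = min(usage, key=usage.get)  # "b" (count 1) loses to "a" (count 2)
del usage[victim]                   # the evicted key's count is forgotten
usage["c"] += 1                     # the new key starts fresh at count 1
```

If "b" were inserted again later, it would start over with a count of one, exactly as the reset rule requires.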
For comparison, see how the LRU (Least Recently Used) Cache works.
Given a capacity and a set of operations to perform on an LFU cache, return the results of those operations.
Each operation will be given as either two or three numbers:
The output contains the results of all get operations, in order:
The structure of the class may look like this:
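One possible sketch of such a class, in Python, uses a map for the values and a min-heap of (count, tick, key) entries with lazy deletion of stale entries; the method names `get`/`put` follow the operations described above, while the internal fields (`values`, `counts`, `latest`, `tick`) are assumptions for this sketch rather than a required interface:

```python
import heapq

class LFUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}   # key -> value
        self.counts = {}   # key -> usage count
        self.heap = []     # (count, tick, key); stale entries removed lazily
        self.tick = 0      # global clock; breaks frequency ties by recency
        self.latest = {}   # key -> tick of that key's most recent heap entry

    def _touch(self, key):
        # Both get and put increment the key's usage count by one.
        self.counts[key] = self.counts.get(key, 0) + 1
        self.tick += 1
        self.latest[key] = self.tick
        heapq.heappush(self.heap, (self.counts[key], self.tick, key))

    def get(self, key):
        if key not in self.values:
            return -1
        self._touch(key)
        return self.values[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict: pop until we reach a live (non-stale) heap entry.
            while True:
                count, t, k = heapq.heappop(self.heap)
                if self.latest.get(k) == t:
                    # Eviction forgets the key's usage count entirely.
                    del self.values[k], self.counts[k], self.latest[k]
                    break
        self.values[key] = value
        self._touch(key)
```

For example, on a cache of capacity 2, after `put(1, 1)`, `put(2, 2)`, `get(1)`, a `put(3, 3)` evicts key 2, since key 1 has been used twice and key 2 only once. The heap gives O(log n) operations; an O(1) design is also possible by keeping doubly linked lists of keys bucketed by frequency.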
We hope that this solution to the design-and-implement-an-LFU-Cache problem will help you level up your Heap and Map coding skills. Companies such as Amazon, Adobe, Citigroup, Microsoft, and Google include LFU Cache interview questions in their tech interviews.
If you are preparing for a tech interview at FAANG or any other Tier-1 tech company, register for Interview Kickstart’s FREE webinar to understand the best way to prepare.
Interview Kickstart offers interview preparation courses taught by FAANG+ tech leads and seasoned hiring managers. Our programs include a comprehensive curriculum, unmatched teaching methods, FAANG+ instructors, and career coaching to help you nail your next tech interview.
We offer 17 interview preparation courses, each tailored to a specific engineering domain or role, including the most in-demand and highest-paying domains and roles, such as:
To learn more, register for the FREE webinar.