Please note that this is about a thread-safe LRU cache (not simply an LRU cache as specified in https://leetcode.com/problems/lru-cache/description/). It isn't a duplicate of the LRU cache design question, because there are some tricky aspects of locking the hash table and linked list (LL) that aren't addressed in other multithreaded LRU design questions.
Advice on how to make an LRU cache thread-safe in C++ seems to be all over the place: some links suggest that locking both the hash table and the linked list is fine, while others don't address C++ directly and only vaguely discuss various approaches.
Can someone describe a functional and efficient approach using locks, including what is locked and when, for implementing a multithreaded version of the LRU cache (https://leetcode.com/problems/lru-cache/description/) in C++, so that all get/set operations remain O(1)?
Also, how do production caches such as memcached implement multithreading support?
You can find an open-source C++ implementation of a hash-table/linked-list based LRU cache in LRUCache.h and LRUCache.inl. (Note: the 'apparent locking' in this implementation isn't real; it's just assertions to detect errant unsafe thread use, for code that is externally synchronized.)
And a small test case (Test.cpp#L721) for the synchronization wrapper; there are other tests for the LRUCache itself.
Note: even if you use a different set of utility classes to implement your LRUCache, the implementation of SynchronizedLRUCache makes clear how to implement synchronization using shared_lock/lock_guard.