Use the cache directly without a decorator #13
Comments
I have 16 different caches and I need to keep them separate, so that overflowing cache 3 does not pop values out of cache 1, and so on.
I was able to come up with a hacky way of handling this, but a natural implementation would still be welcome. Pseudocode:

```python
def __init__(self):
    self.caches = []
    for _ in range(16):
        new_cache = mem.cached(custom_key_maker=self._custom_keys)(self.get_next_state)
        self.caches.append(new_cache)

def get_next_state(self, arg1, arg2, arg3, use_cache=True):
    if use_cache:
        cache_idx = get_cache_id(arg1)
        cache = self.caches[cache_idx]
        return cache(arg1, arg2, arg3)
```

This creates a once-recursive call that utilizes a cache from a list of caches; I had to also leverage …
Yes, upvote for sure. Any sophisticated caching usage quickly escapes the boundaries imposed by decorators.
I have a need to get/set values from an LFU cache directly, rather than through a function decorator. The need is as follows:
This pattern of having multiple caches and only knowing which one to leverage inside the function that is to be cached means I cannot use a decorator.
How can I access `memoization` caches directly?
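Until the library exposes its caches, one workaround is a hand-rolled LFU with explicit get/set, which sidesteps the decorator entirely. A minimal sketch, assuming a simple frequency-count eviction policy; `MiniLFU` and its interface are illustrative and not the `memoization` library's API:

```python
from collections import Counter

class MiniLFU:
    """Minimal LFU cache with a plain get/set interface."""

    def __init__(self, maxsize):
        self.maxsize = maxsize
        self._data = {}      # key -> value
        self._freq = Counter()  # key -> access count

    def get(self, key, default=None):
        if key in self._data:
            self._freq[key] += 1
            return self._data[key]
        return default

    def set(self, key, value):
        if key not in self._data and len(self._data) >= self.maxsize:
            # Evict the least-frequently-used entry.
            victim, _ = min(self._freq.items(), key=lambda kv: kv[1])
            del self._data[victim]
            del self._freq[victim]
        self._data[key] = value
        self._freq[key] += 1
```

With direct get/set, the multiple-cache pattern from the earlier comment becomes trivial: build `[MiniLFU(128) for _ in range(16)]` and index into the list inside the function, no decorator required.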