
Possibly a good/better LRFU cache? #3

Open
Logic-Elliven opened this issue Feb 19, 2022 · 1 comment

@Logic-Elliven

Hi tugrul512bit :)

Again; I'm no dev, so just thought you may find this interesting/useful:

"...The weak-lru-cache package provides a powerful cache that works in harmony with the JS garbage collection (GC) and a least-recently used (LRU) and least-frequently used (LFU) expiration strategy to help cache data with highly optimized cache retention.

It uses LRU/LFU (LRFU) expiration to retain referenced data, and then once data has been inactive, it uses weak references (and finalization registry) to allow GC to remove the cached data as part of the normal GC cycles, but still continue to provide cached access to the data as long as it still resides in memory and hasn't been collected.

This provides the best of modern expiration strategies combined with optimal GC interaction..."

https://github.com/kriszyp/weak-lru-cache

@tugrul512bit
Owner

LRU is really too naive for advanced tasks. It needs more options for different access patterns, of course. A hybrid approach, perhaps with AI-driven expiration choices, could also be fast in some cases. But there needs to be quick access to the data with as little compute as possible, or at least vectorized access where possible.
