Loop through all handlers #480
Replies: 2 comments 2 replies
-
Hey, @topolanekmartin! Thank you for your question.

CacheHandler's main objective is to assist in developing multi-instance applications that can scale effectively. The team behind Next.js recommends turning off in-memory caching in such setups, and this is a valid suggestion: when each application instance has its own local cache, the data can eventually diverge between the different instances. Therefore, for multi-instance applications, it is always advisable to prioritize a shared cache. Local caching should only be a fallback for when a shared cache is unavailable.

CacheHandler is specifically designed to work with cache stores that have TTL (Time To Live) functionality. If a cache store doesn't support this feature, CacheHandler internally checks whether the value has expired. Users can manually set TTL periods via the revalidate option (Pages Router revalidate option), indicating that they expect to see new data after this period. All prebuilt Handlers respect TTL, so if a value is not found in one cache store, it is highly likely that it is absent from the others as well. In such cases, Next.js renders the new version and passes it to the CacheHandler.

Users can revalidate the cache at any time using the revalidate function (Pages Router revalidate function). In multi-instance setups, they can revalidate a shared cache value by calling the function once on any of the instances. However, if they only have local caches, they need to call revalidate on every instance.

With all that said, there is no reason to place the local cache in front of the shared one in a multi-instance setup. Could you explain why you want to use the local cache before the shared one?
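To illustrate the internal expiry check mentioned above, here is a minimal sketch of how a handler might emulate TTL on top of a store that has no native expiry. The names (`Entry`, `TtlMapHandler`) and the shape of the interface are illustrative assumptions, not the library's actual API:

```typescript
// Hypothetical sketch: emulating TTL over a store without native expiry.
interface Entry<T> {
  value: T;
  expiresAt: number; // epoch ms; Infinity means "never expires"
}

class TtlMapHandler<T> {
  private store = new Map<string, Entry<T>>();

  set(key: string, value: T, ttlSeconds?: number): void {
    const expiresAt =
      ttlSeconds === undefined ? Infinity : Date.now() + ttlSeconds * 1000;
    this.store.set(key, { value, expiresAt });
  }

  // An expired entry is treated as a miss, mirroring the internal
  // expiry check described above.
  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}
```

Because every level applies the same TTL rule, a value that has expired in one store is expected to be expired (or absent) in the others too, which is why a miss in one store is a strong signal that the whole cache needs regeneration.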
-
This makes some sense. However, I'd suggest trying to explain this more carefully, since it's different from how many other multi-level caches work. For example, a CPU will load data from main memory into the L1/L2/L3 caches closer to the core to improve performance. Yes, that requires maintaining cache consistency, which is one of the harder problems in computer science, and I can understand why you don't want to take on that challenge. But, again, the documentation should be improved. It's also unclear to me why you'd ever use multiple cache implementations given this limitation.
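The CPU-cache behavior described above can be sketched as a read path that promotes values from slower levels into faster ones. This is illustrative only (plain `Map`s standing in for cache levels, not the library's API), and it shows exactly where the consistency burden comes from: once a value has been promoted into several levels, invalidation has to touch all of them.

```typescript
// Illustrative multi-level read with promotion, CPU-cache style.
type Level = Map<string, string>;

function readThrough(levels: Level[], key: string): string | undefined {
  for (let i = 0; i < levels.length; i++) {
    const value = levels[i].get(key);
    if (value !== undefined) {
      // Promote the hit into every faster level in front of this one.
      for (let j = 0; j < i; j++) levels[j].set(key, value);
      return value;
    }
  }
  return undefined; // miss in every level
}
```

After a hit in a slow level, subsequent reads are served from the fastest level, but the copies must now be invalidated together or they drift apart.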
-
Currently the get methods are called in order and the first value found is returned, even if it is null or undefined; the next handler is only tried when an exception is thrown. What if I wanted to have custom handlers in this order: a local in-memory cache first, then Redis?
Unfortunately, the data in Redis is then ignored: because the value is missing from the in-memory cache, null is returned and the set method is invoked. I would like the get methods to be called on all handlers in turn, and only after none of them returns a value (or all of them throw) should the set function be called.
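The requested behavior could be sketched like this, under an assumed minimal `Handler` interface (the `get`/`set` signatures here are simplified stand-ins, not the library's exact API): try every handler's get in order, skip null/undefined results instead of returning them, and only report a miss after all handlers have come up empty or thrown.

```typescript
// Hypothetical fall-through lookup across handlers.
interface Handler {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
}

async function getFromAny(
  handlers: Handler[],
  key: string,
): Promise<string | null> {
  for (const handler of handlers) {
    try {
      const value = await handler.get(key);
      // Skip null/undefined instead of returning early, so a miss in
      // the in-memory handler still lets the Redis handler answer.
      if (value !== null && value !== undefined) return value;
    } catch {
      // Treat a failing handler as a miss and move on to the next one.
    }
  }
  return null; // only now would the caller re-render and call set
}
```

With this loop, an empty in-memory cache no longer shadows a populated Redis instance; set is only reached after a genuine miss everywhere.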