Introduce a function for caching static content #4900
Conversation
Overall, this is a much cleaner and more concise implementation than the original. However, it shares the same issue: should the element have a pipeline to convert the input, it is likely no longer a cached object. Case in point: `ui.markdown(cache('''Cached **Markdown**'''))`
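To illustrate the concern with a minimal, self-contained sketch (assuming `cache()` returns a `str` subclass such as `CachedStr`; this is not the actual implementation):

```python
class CachedStr(str):
    """Marker subclass so the prop can later be swapped for a cache reference."""


def cache(value: str) -> str:
    return CachedStr(value)


content = cache('Cached **Markdown**')
print(isinstance(content, CachedStr))  # True

# An element with an input pipeline (e.g. Markdown -> HTML) produces a new plain str,
# so the marker is already gone by the time the value reaches the props dictionary:
transformed = content.replace('**', '<strong>', 1).replace('**', '</strong>', 1)
print(isinstance(transformed, CachedStr))  # False
```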
Overall, I am assessing which way to go: #4796 defines one and only one caching strategy per element, chosen by common sense (which input is likely going to be the biggest), and if you want a different one you create a separate element.
#4900 allows the flexible definition of a cache strategy, but puts the responsibility on the user to call `cache()`.
I think #4900 is worth it over #4796 since it is easier to use, despite not having one easy-to-call default per element. Check if you agree. If so, we can proceed with this PR more seriously. Thanks! P.S.: It is clean since you leverage OOP, something which I should learn more about.
@falkoschindler 3 lines of change for making it work:

[screenshot]

I think I like this over #4796 since this is more Pythonic. Even Pydantic got its …
I think so, yes. I'd rather start with an easy solution and see if we can come up with a better one later.
This raises an important point: Elements don't automatically support cached input if, e.g., strings are transformed before being put into the props dictionary. I'll need to check how many elements are affected.
Yes, this might be a solution. Alternatively the auto-index client could store hashes for individual connections, but this might require too much memory for long-running apps.
I'm not sure if this is a relevant use case since class strings are rather short. I don't think replacing them with "CACHE_<32-digit-hash-id>" is worth the effort.
The original motivation for caching arose after introducing the tree-based documentation hierarchy in #4732. By using …
I don't think the two implementations are mutually exclusive. Rather, they can complement each other in select use cases. However, you could say this does make implementing this PR less urgent.
True. Even if you have an SPA, you may want to speed up the page load when revisiting it.
Motivation
Inspired by PR #4796, this PR uses a simpler approach by caching individual strings, lists and dictionaries rather than whole elements. At the moment it is just a proof of concept showing the general idea. It can be tested as follows:
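For example, with a tiny app along these lines (a sketch only; the import path of `cache` is an assumption and may differ in this PR):

```python
from nicegui import ui
from nicegui import cache  # hypothetical import path for the new function

ui.label(cache('some long static text that rarely changes'))  # wrapped: should be cached
ui.label('dynamic or short text')                             # unwrapped: sent as usual

ui.run()
```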
A temporary `console.log` shows cache misses on the JavaScript console. This should only happen when visiting the app for the first time or when altering content which is wrapped in a `cache()` call.

Implementation
The user wraps static content with the new `cache()` function, which converts `str` to `CachedStr`, `list` to `CachedList` and `dict` to `CachedDict`. This way Python can continue interacting with these objects like before, but props can later be augmented with cache information before sending them to the client. This is done in the form of a string "CACHE_<hash_id><json>". In case the server knows that a specific hash ID is known to the client, it omits the "<json>" part, saving payload.
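To make this concrete, here is a rough, standalone sketch of the idea (the class names and the "CACHE_<hash_id><json>" format are taken from the description above; the hashing and bookkeeping details are simplified assumptions, not the actual code):

```python
import hashlib
import json
from typing import Any, Set


class CachedStr(str):
    """Behaves like a normal str, but marks the value as cacheable."""


class CachedList(list):
    """Marker subclass for cacheable lists."""


class CachedDict(dict):
    """Marker subclass for cacheable dictionaries."""


def cache(value: Any) -> Any:
    """Wrap static content so props can later be replaced by a cache reference."""
    if isinstance(value, str):
        return CachedStr(value)
    if isinstance(value, dict):
        return CachedDict(value)
    if isinstance(value, list):
        return CachedList(value)
    raise TypeError(f'cannot cache values of type {type(value).__name__}')


def encode_prop(value: Any, known_hashes: Set[str]) -> Any:
    """Encode cached values as "CACHE_<hash_id><json>", omitting the JSON if the hash is known."""
    if not isinstance(value, (CachedStr, CachedList, CachedDict)):
        return value  # plain values are sent unchanged
    payload = json.dumps(value)
    hash_id = hashlib.md5(payload.encode()).hexdigest()  # 32-digit hash ID
    if hash_id in known_hashes:
        return f'CACHE_{hash_id}'  # client already has the data, omit the payload
    return f'CACHE_{hash_id}{payload}'


known: Set[str] = set()  # in the PR, initialized from the client's cookie (see below)
first = encode_prop(cache('Cached **Markdown**'), known)   # "CACHE_<hash><json>"
known.add(first[len('CACHE_'):len('CACHE_') + 32])         # pretend the client reported the hash back
second = encode_prop(cache('Cached **Markdown**'), known)  # "CACHE_<hash>" only
print(first, second, sep='\n')
```

Note that `encode_prop` is a made-up helper for illustration; in the PR the existing props are augmented right before they are sent to the client.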
The client "unhashes" all props that start with "CACHE_". Unknown hashes are automatically added to a dictionary "__nicegui_data_store__" in `localStorage` and to a cookie "__nicegui_hash_keys__". When reloading the page, the server can use the information from this cookie to initialize its set of known hashes.

Overall this implementation is rather simple. Let's discuss if this feels nice or if it is missing anything important.
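For completeness, a sketch of how the server side could rebuild its set of known hashes from that cookie on a reload (the comma-separated cookie format and the helper name are assumptions, not the actual code):

```python
from fastapi import Request


def known_hashes_from_cookie(request: Request) -> set[str]:
    """Parse the "__nicegui_hash_keys__" cookie into a set of hash IDs."""
    raw = request.cookies.get('__nicegui_hash_keys__', '')
    return {h for h in raw.split(',') if h}
```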
Progress