Client side chat memory? #1635
patriot1burke started this conversation in Ideas
Replies: 1 comment 9 replies
-
👋🏽 I must be missing something, but isn't this what the client currently does?
-
Chat memory greatly bothers me. Holding session state in server memory does not scale well, and it forces you into a clustered cache and/or sticky sessions between the client and the server.
My thought: why not serialize the chat memory, send it back to the client, and have the client resend it with each chat message?
Sound good? Here's an initial implementation: a @RequestScoped bean that is injected into the application and filled by it, then serialized once the user message has been processed. Interested in getting a PR for this once I flesh out the details?
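The round trip described above could be sketched roughly like this. This is only a minimal, hypothetical illustration using an opaque Base64 token and plain strings as messages; the class name `ClientChatMemory` and the token format are my own inventions, and a real implementation would serialize the framework's actual chat-message types instead.

```java
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;

// Hypothetical sketch of "client-side" chat memory: the client echoes back
// an opaque token on every request; the server rebuilds the transcript from
// it, appends the new exchange, and hands an updated token back.
public class ClientChatMemory {
    private final List<String> messages = new ArrayList<>();

    // Rebuild memory from the token the client sent (empty on the first turn).
    public static ClientChatMemory fromToken(String token) {
        ClientChatMemory m = new ClientChatMemory();
        if (token != null && !token.isEmpty()) {
            String decoded = new String(Base64.getDecoder().decode(token));
            for (String line : decoded.split("\n")) {
                m.messages.add(line);
            }
        }
        return m;
    }

    public void add(String message) {
        messages.add(message);
    }

    public List<String> messages() {
        return messages;
    }

    // Serialize the transcript into an opaque token for the client to resend.
    public String toToken() {
        return Base64.getEncoder().encodeToString(String.join("\n", messages).getBytes());
    }
}
```

In the proposal, a @RequestScoped bean would deserialize the incoming token at the start of the request and serialize the updated memory into the response once the user message has been processed, so the server holds no per-session state between requests.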
I've also had HUGE problems getting the AI to format complex responses correctly, so instead I've been piggy-backing data on the response. Eventually I'm going to have a client Response Context object that contains the current ChatMemory and a set of UI events that is calculated by a tool and attached to the context returned to the client. The client will then output the chat response from the AI and loop through any piggy-backed UI events to render the relevant information.
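The piggy-backed response context might look something like the sketch below. Everything here is an assumption on my part: the names `ResponseContext` and `UiEvent`, and the choice of fields, are hypothetical; the source only says the object carries the chat response, the current ChatMemory, and a set of tool-computed UI events.

```java
import java.util.List;

// Hypothetical envelope returned to the client instead of asking the model
// to format complex output itself: plain chat text, the serialized
// chat-memory token to echo on the next request, and structured UI events
// computed server-side by a tool for the client to render.
public class ResponseContext {
    public record UiEvent(String type, String payload) {}

    public final String chatText;      // the model's natural-language reply
    public final String memoryToken;   // serialized chat memory to resend
    public final List<UiEvent> events; // structured data for the UI to render

    public ResponseContext(String chatText, String memoryToken, List<UiEvent> events) {
        this.chatText = chatText;
        this.memoryToken = memoryToken;
        this.events = events;
    }
}
```

The design point is that the model's free-form text and the machine-readable UI data travel in separate fields, so the client never has to parse structured data out of the chat response.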