Unlocking live context in prompts and variables #1267
-
Yeah, I appreciate this. Coming up with terms for every new feature in CodeCompanion isn't exactly my forte 😆. My thought process for this was that on every "submit" you're sending a new message to your LLM, and you're effectively "pinning" that buffer to every new message rather than pinning the buffer to the chat buffer itself. It was implemented prior to watch and will likely be removed from the plugin soon.

Regarding your ask... can you give me a non-trivial example of what data you'd like to be "live"? I appreciate the time example, but I can't foresee that making an iota of difference in the quality of any output from an LLM, and I have never required anything to be "live" other than buffers.
-
@einarpersson - FYI, I added a prompt decorator feature yesterday that you may find useful. It decorates a user prompt prior to sending it over to an LLM, and writes this into the chat history too.
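As a rough sketch of how that might be wired up, assuming `prompt_decorator` is a function under `opts` that receives the outgoing message and returns the decorated string (check the docs for the exact signature):

```lua
require("codecompanion").setup({
  opts = {
    -- Called on every submit; wraps the user's prompt before it is sent
    prompt_decorator = function(message, adapter, context)
      return string.format([[<prompt>%s</prompt>]], message)
    end,
  },
})
```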
-
Hi! I am so excited about CodeCompanion!
I would like to unlock a 'live context' capability. What do I mean by that? Well, a lot of contextual information changes over time; it becomes stale. The most obvious example is the time/clock itself. I would like to include such context in the system prompt and have it refreshed on each new message submitted.
I have used this capability in other AI chat tools with great success; it really makes the user experience much smoother. I made a private fork of aichat just for this purpose. In that case it made my shell assistant aware of the cwd, the time, etc., small things that helped the assistant be more accurate, since I made sure it had current information. Doing it this way was also very economical token-wise. The one thing to be aware of is that I had to wrap the live data in a kind of "live data block", which made it clear to the LLM that this data was always the current one and that this would explain any discrepancies with the conversation that follows. Without that, there could be some confusion. But with that, it worked really well!
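For illustration, such a block might look something like this (the tag name and values here are just made up):

```
<live-context>
The data below is refreshed on every message and always reflects the current state.
time: 2025-04-15 14:32:07
cwd: /home/user/projects/demo
</live-context>
```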
I have been looking at the docs for extending prompts and was glad to find that I could provide functions to generate dynamic prompts. However, I was missing a `live = true` or `refresh = true` option that would enable the capability I was talking about above.

**Suggestion 1**
So one concrete suggestion is to add such a property to the message tables (see the sketch below). Then, on each new message sent, the function would re-run and replace the message at the same index in the chat conversation.
Or perhaps, for simplicity, it could be added to the `opts` table of the whole prompt. But that would make it less granular and less performant, and I'm also not sure what it would mean for prompts with references.
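To make this concrete, here is a minimal sketch of what Suggestion 1 could look like in a prompt library entry. The `live = true` flag is the proposed (hypothetical) part; the rest follows the prompt library format from the docs as I understand it:

```lua
require("codecompanion").setup({
  prompt_library = {
    ["Shell Assistant"] = {
      strategy = "chat",
      description = "Chat with live shell context",
      prompts = {
        {
          role = "system",
          -- This function would be re-evaluated on every submit...
          content = function(context)
            return "The current time is " .. os.date("%Y-%m-%d %H:%M:%S")
              .. " and the working directory is " .. vim.fn.getcwd()
          end,
          -- ...because of this proposed flag (does not exist today):
          opts = { live = true },
        },
      },
    },
  },
})
```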
**A related note**

I was also fiddling around with `#buffer` and `#buffer{pin}`. As noted in the docs, pinning a buffer increases token usage a lot, since a whole new copy of the buffer is appended on each roundtrip.

When I first saw the pinning feature, I actually thought it was more in line with the 'live data' concept above, i.e. that the buffer was inserted/appended into the system prompt and replaced/refreshed on each new user message sent. That is, to me, a more natural meaning of 'pinning'.
(Note: I'm aware of the watch feature and it may be good, but I would ideally want something like the above to be possible.)
And to solve a more general problem: when looking in the docs at creating your own variables, I would like something similar to what I described above, a key such as `refresh = true`, `live = true`, or `reevaluate = true`. It could be injected into the system prompt within a `<live-context>` block or something similar. Then it would once again be possible to just add `#clock` to the chat buffer, and from then on the assistant would always have the up-to-date time.
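Again, as a sketch only: the inline `callback` below follows the custom-variables shape from the docs as far as I understand it (the docs may expect a module path instead of a function), and both `live = true` and the `<live-context>` wrapping are the hypothetical parts:

```lua
require("codecompanion").setup({
  strategies = {
    chat = {
      variables = {
        ["clock"] = {
          -- Would be re-evaluated on every submit if the proposed flag existed
          callback = function()
            return "<live-context>\nThe current time is "
              .. os.date("%Y-%m-%d %H:%M:%S")
              .. "\n</live-context>"
          end,
          description = "Always-current date and time",
          opts = { live = true }, -- proposed; does not exist today
        },
      },
    },
  },
})
```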
**Suggestion 2**
Add a similar `live = true` (or whatever name is best) to the variables table.

Looking forward to hearing your thoughts on this. Perhaps the time example sounds a bit contrived, but I really think this would unlock a new capability, increase accuracy, and save tokens.