Replies: 1 comment 1 reply
-
@mrddter On the first 2 evaluations you did in this example, when functions are provided with a prompt, Functionary models include a description of these functions and how to use them at the beginning of the context. To simplify this, imagine that the context looks like this:

```ts
let contextState = "";

// After the 1st question:
// we append to the current state to achieve this state:
// "[system prompt][user prompt 1]"
contextState += "[system prompt][user prompt 1]";

// After the 2nd question:
// we append to the current state to achieve this state:
// "[system prompt][user prompt 1][model answer 1][user prompt 2]".
// The model already added "[model answer 1]" as part of generating the answer
// to the previous prompt, so only "[user prompt 2]" is new
contextState += "[user prompt 2]";

// After the 3rd question:
// we have to erase most of the current state to achieve this state:
// "[system prompt][available functions description][user prompt 1][model answer 1][user prompt 2][model answer 2][user prompt 3]"
contextState = contextState.slice(0, "[system prompt]".length)
    + "[available functions description][user prompt 1][model answer 1][user prompt 2][model answer 2][user prompt 3]";
```

The evaluation for the 3rd question is expensive because an earlier part of the context has changed: everything after `"[system prompt]"` has to be evaluated again, so the model now has to evaluate a very long "text", and that takes some time.

I plan to add support for more models, and there may be models that don't have to be given the list of available functions at the beginning of the context, which will allow for an optimized use of an existing context state.

As a workaround for now, you can provide all the functions on all of the prompts, so the existing context state can be efficiently utilized. To investigate what the current context state looks like, you can do …
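For illustration, here is a minimal sketch of that workaround: defining the functions once and passing the same set on every prompt, so the `"[available functions description]"` prefix is in the context from the very first evaluation. It assumes the v3 beta API (`getLlama`, `LlamaChatSession`, `defineChatSessionFunction`); the model file name and the `getCurrentTime` function are made up for the example:

```ts
import path from "path";
import {getLlama, LlamaChatSession, defineChatSessionFunction} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    // Hypothetical file name; point this at your own Functionary GGUF
    modelPath: path.join(process.cwd(), "functionary.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Define the functions once
const functions = {
    getCurrentTime: defineChatSessionFunction({
        description: "Get the current time as an ISO string",
        handler() {
            return new Date().toISOString();
        }
    })
};

// Pass the same `functions` on every prompt, including prompts that won't use them,
// so the functions description is part of the context from the start and a later
// prompt never invalidates the already-evaluated state
const answer1 = await session.prompt("Hi there", {functions});
const answer2 = await session.prompt("What time is it?", {functions});
console.log(answer1, answer2);
```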
-
First of all, thank you for this work. It's amazing 💕
I'm trying out the beta version `3.0.0-beta.9`, and (maybe it's my fault) I noticed that responses become slow as soon as functions are added. This happens even with a simple function. I report the case below:

…

And this is the output:

…
In the first scenario, when it doesn't call tavily, it responds in less than 4 seconds. When it invokes the function, the time increases to 30 seconds (tavily itself takes almost 4.5 seconds).
I also tried a simple function that returns a static string, and even in that case the time increased, albeit by a few seconds (2-3, from memory).
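For reference, a minimal version of that static-string test might look like this. It's a sketch assuming the v3 beta API; the model path and the `getMotd` function are made up for the example:

```ts
import path from "path";
import {getLlama, LlamaChatSession, defineChatSessionFunction} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(process.cwd(), "model.gguf") // hypothetical path
});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

const functions = {
    getMotd: defineChatSessionFunction({
        description: "Get the message of the day",
        handler() {
            // Static string: no real work happens here, so any extra latency
            // comes from context evaluation, not from the function itself
            return "Hello from a static function";
        }
    })
};

console.time("prompt with function");
const answer = await session.prompt("What's the message of the day?", {functions});
console.timeEnd("prompt with function");
console.log(answer);
```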
Is there a way to speed it up? Any suggestions?