-
Hey @phasmax, how would you want to specify this?
-
@krrishdholakia Hi, I may have missed this in the docs, but can we also pass the request message from a client like OpenWebUI into the "prompt_variables" setting dynamically? I didn't find a way to do that with the LiteLLM proxy. Also, could we route the result to another LLM model before sending the response back to the client?
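For reference, something like this is roughly what I'd hope to do (untested sketch; the proxy URL, API key, and `my-langfuse-model` name are placeholders from my config, and it assumes the proxy forwards `prompt_variables` from the request body to the Langfuse prompt integration):

```python
import openai

# Point the OpenAI client at the LiteLLM proxy (address/key are placeholders).
client = openai.OpenAI(
    base_url="http://localhost:4000",
    api_key="sk-1234",
)

user_message = "What is LiteLLM?"

response = client.chat.completions.create(
    model="my-langfuse-model",  # model_name from the proxy config (assumed)
    messages=[{"role": "user", "content": user_message}],
    # Non-standard params go through extra_body; the hope is the proxy
    # would pass prompt_variables through to the Langfuse prompt template.
    extra_body={"prompt_variables": {"user_message": user_message}},
)
print(response.choices[0].message.content)
```

The sticking point is that a client like OpenWebUI sends a plain chat completion request, so there's no obvious place for it to attach `prompt_variables` itself; it would have to be derived from the incoming messages on the proxy side.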
-
Langfuse prompt management (prompt retrieval by ID) works well. The only thing missing is the ability to also specify the prompt_label, i.e. in Langfuse you can attach labels like "production" and "stage" to versions of a prompt.
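For context, this is what label-based retrieval looks like when calling the Langfuse Python SDK directly; the prompt name here is illustrative, and the ask is for LiteLLM to expose the same `label` option:

```python
from langfuse import Langfuse

langfuse = Langfuse()  # reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST

# Without a label, Langfuse resolves the version labeled "production" by default.
prod_prompt = langfuse.get_prompt("my-chat-prompt")

# With a label, a specific staged version can be pulled instead.
staged_prompt = langfuse.get_prompt("my-chat-prompt", label="stage")

# compile() substitutes the prompt's template variables.
print(staged_prompt.compile(user_message="hello"))
```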