Output_schema with LiteLlm #301
-
I'm trying to create an Agent with an OpenAI model, but I didn't manage to also use output_schema.
This is the error I got:
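For context, here is a minimal sketch of the setup that triggers this, assuming the ADK Python API; the agent name, model string, and schema below are illustrative, not from the original report:

```python
from pydantic import BaseModel

from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm


class CapitalOutput(BaseModel):
    """Hypothetical structured output used only for illustration."""
    capital: str


# Routing to an OpenAI model through LiteLLM while also requesting a
# structured response via output_schema -- this combination is what fails.
agent = LlmAgent(
    name="capital_agent",
    model=LiteLlm(model="openai/gpt-4o"),
    instruction="Answer with the capital city of the given country.",
    output_schema=CapitalOutput,  # not supported with LiteLlm-backed models here
)
```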
Replies: 4 comments
-
This is a current limitation of the model and it will be fixed in the future.
-
Same problem here, and with the anthropic_llm as well.
-
I'm getting this same problem when specifying
-
Is there an ETA on a fix for this?