Replies: 1 comment
-
Hello @ly000! Yes! It's a bit tricky with Azure: https://docs.litellm.ai/docs/sdk_custom_pricing Remember, ExtractThinker has two drivers in the LLM component: LiteLLM by default, and Pydantic AI.
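For context, the linked page covers LiteLLM's custom model pricing. A minimal sketch of such a pricing entry, assuming an Azure deployment (the deployment name and per-token prices here are placeholders, not values from this thread):

```python
# Custom pricing entry in the shape LiteLLM's custom-pricing docs describe;
# deployment name and per-token prices are hypothetical placeholders.
custom_pricing = {
    "azure/my-deployment": {
        "input_cost_per_token": 1.5e-6,
        "output_cost_per_token": 2.0e-6,
        "litellm_provider": "azure",
        "mode": "chat",
    }
}

# It would then be registered with LiteLLM before making calls:
#   import litellm
#   litellm.register_model(custom_pricing)
```

Since ExtractThinker uses LiteLLM as its default driver, registering the pricing with LiteLLM directly should make it apply to calls routed through ExtractThinker as well.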
-
Hi everybody,
I'm struggling to initialize the demo project using an LLM proxy, which has access to a few models that are supported by ExtractThinker. In my mind the setup should be the same as for the "Local Demo": setting the API_BASE environment variable and the key, and choosing one of the models provided by my proxy.
e.g.
What am I doing wrong, or is this not supported?
Thanks for the help
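For reference, a sketch of the kind of setup described above, assuming an OpenAI-compatible proxy and the environment-variable names LiteLLM reads for OpenAI-style endpoints (the URL, key, and model name are placeholders):

```python
import os

# Point the OpenAI-compatible client at the proxy
# (URL and key are hypothetical placeholders).
os.environ["OPENAI_API_BASE"] = "https://my-proxy.example.com/v1"
os.environ["OPENAI_API_KEY"] = "sk-proxy-key"

# With the environment set, ExtractThinker's default LiteLLM driver
# can be pointed at a proxy-served model, roughly:
#   from extract_thinker import Extractor
#   extractor = Extractor()
#   extractor.load_llm("openai/gpt-4o")  # model name is a placeholder
```

The `openai/` prefix tells LiteLLM to treat the model as an OpenAI-compatible endpoint, which is how most LLM proxies expose their models.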