xAI keeps using OpenAI endpoint? #9283
Unanswered · BarryBahrami asked this question in Q&A
Replies: 1 comment, 1 reply
-
I went into the model and updated the API base to https://api.x.ai/v1, and now it works. Thank you.
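For anyone hitting the same thing, the fix above can also be applied in code by passing `api_base` explicitly. A minimal sketch, assuming LiteLLM's `xai/` provider prefix and `grok-2-latest` as the model name (the key is a placeholder):

```python
# Minimal sketch: route a LiteLLM completion to xAI instead of the
# default OpenAI endpoint. The "xai/" prefix and model name are
# assumptions based on LiteLLM's provider conventions.

def xai_completion_kwargs(prompt: str, api_key: str) -> dict:
    """Build completion arguments that point at xAI's API."""
    return {
        "model": "xai/grok-2-latest",        # "xai/" selects the xAI provider
        "api_base": "https://api.x.ai/v1",   # override the OpenAI default
        "api_key": api_key,                  # must be an xAI key (xai-...)
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    import litellm
    response = litellm.completion(**xai_completion_kwargs("Hello", "xai-YOUR-KEY"))
    print(response.choices[0].message.content)
```

Without the `api_base` override (or an equivalent setting in the UI), the OpenAI-client code path falls back to api.openai.com, which rejects the xai- key.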
-
Hello,
It seems that no matter how hard I try, the xAI models keep calling the OpenAI endpoint, so every request fails. Any ideas? Thank you.
litellm.AuthenticationError: AuthenticationError: XaiException - Incorrect API key provided: xai-5l0U************************************************************************5vlC. You can find your API key at https://platform.openai.com/account/api-keys.
stack trace: Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 794, in acompletion
headers, response = await self.make_openai_chat_completion_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<4 lines>...
)
^
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/logging_utils.py", line 131, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 438, in make_openai_chat_completion_request
raise e
File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 420, in make_openai_chat_completion_request
await openai_aclient.chat.completions.with_raw_response.create(
**data, timeout=timeout
)
File "/usr/lib/python3.13/site-packages/openai/_legacy_response.py", line 381, in wrapped
return cast(LegacyAPIResponse[R],
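The trace above goes through LiteLLM's OpenAI client path, which means the xai- key was sent to api.openai.com (hence the platform.openai.com message in the error). One way to rule out a bad key is to call xAI's OpenAI-compatible endpoint directly; a stdlib-only sketch, with `grok-2-latest` as an assumed model name and a placeholder key:

```python
import json
from urllib.request import Request, urlopen

def build_xai_request(api_key: str, prompt: str) -> Request:
    """POST straight to xAI's chat endpoint, bypassing any OpenAI default."""
    return Request(
        "https://api.x.ai/v1/chat/completions",
        data=json.dumps({
            "model": "grok-2-latest",  # assumed model name
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    # A successful response here means the key is fine and the problem
    # is request routing in LiteLLM, not the key itself.
    with urlopen(build_xai_request("xai-YOUR-KEY", "ping")) as resp:
        print(json.load(resp))
```

If this direct call succeeds while LiteLLM still fails, the model's `api_base` is the thing to check.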