My direct call to OpenRouter works:
-Body '{ "model": "mistralai/mistral-7b-instruct", "messages": [ { "role": "user", "content": "Bonjour, qui es-tu ?" } ] }'
Via LiteLLM, I get "Expected object, received string" on every OpenRouter model.
I'm using version 1.72.1.
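The failing request goes through the proxy's OpenAI-compatible endpoint. A minimal reproduction sketch, assuming the proxy from the log below on http://0.0.0.0:4000 (sk-1234 is a placeholder proxy key):

```python
from openai import OpenAI

# Standard OpenAI client pointed at the LiteLLM proxy; the model name is the
# "mistral-7b-instruct" group from the proxy's model list below.
client = OpenAI(base_url="http://0.0.0.0:4000", api_key="sk-1234")  # placeholder key

# Through the proxy this fails with a 400: "Expected object, received string".
completion = client.chat.completions.create(
    model="mistral-7b-instruct",
    messages=[{"role": "user", "content": "Bonjour, qui es-tu ?"}],
)
print(completion.choices[0].message.content)
```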
Full LiteLLM log:
LiteLLM: Proxy initialized with Config, Set models:
gpt-3.5-turbo
gpt-4-turbo
claude-3-opus
claude-3-sonnet
claude-3-haiku
deepseek-chat
deepseek-coder
mistral-7b-instruct
openrouter/mistralai/mistral-7b-instruct
mixtral-8x7b
command-r-plus
gemma-7b-it
qwen-14b-chat
qwen-7b-chat
21:12:11 - LiteLLM Router:INFO: router.py:652 - Routing strategy: simple-shuffle
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)
21:12:31 - LiteLLM Proxy:INFO: parallel_request_limiter.py:68 - Current Usage of key in this minute: None
21:12:31 - LiteLLM:INFO: utils.py:2827 -
LiteLLM completion() model= mistralai/mistral-7b-instruct; provider = openrouter
21:12:31 - LiteLLM Router:INFO: router.py:1095 - litellm.acompletion(model=openrouter/mistralai/mistral-7b-instruct) Exception litellm.BadRequestError: OpenrouterException - {"error":{"message":"Expected object, received string","code":400},"user_id":"user_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"} LiteLLM Retried: 2 times
21:12:31 - LiteLLM Router:INFO: router.py:3557 - Retrying request with num_retries: 2
21:12:32 - LiteLLM:INFO: utils.py:2827 -
LiteLLM completion() model= mistralai/mistral-7b-instruct; provider = openrouter
21:12:32 - LiteLLM Router:INFO: router.py:1095 - litellm.acompletion(model=openrouter/mistralai/mistral-7b-instruct) Exception litellm.BadRequestError: OpenrouterException - {"error":{"message":"Expected object, received string","code":400},"user_id":"user_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"} LiteLLM Retried: 2 times
21:12:32 - LiteLLM:INFO: utils.py:2827 -
LiteLLM completion() model= mistralai/mistral-7b-instruct; provider = openrouter
21:12:32 - LiteLLM Router:INFO: router.py:1095 - litellm.acompletion(model=openrouter/mistralai/mistral-7b-instruct) Exception litellm.BadRequestError: OpenrouterException - {"error":{"message":"Expected object, received string","code":400},"user_id":"user_2xsRUj8ffkIO9OFTdMo8e3RMW8w"} LiteLLM Retried: 2 times
21:12:32 - LiteLLM Router:INFO: router.py:3266 - Trying to fallback b/w models
21:12:32 - LiteLLM Proxy:ERROR: common_request_processing.py:344 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.BadRequestError: OpenrouterException - {"error":{"message":"Expected object, received string","code":400},"user_id":"user_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"}. Received Model Group=mistral-7b-instruct
Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
Traceback (most recent call last):
File "C:\LiteLLM\venv\Lib\site-packages\litellm\llms\custom_httpx\llm_http_handler.py", line 73, in _make_common_async_call
response = await async_httpx_client.post(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\LiteLLM\venv\Lib\site-packages\litellm\litellm_core_utils\logging_utils.py", line 135, in async_wrapper
result = await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\LiteLLM\venv\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 256, in post
raise e
File "C:\LiteLLM\venv\Lib\site-packages\litellm\llms\custom_httpx\http_handler.py", line 212, in post
response.raise_for_status()
File "C:\LiteLLM\venv\Lib\site-packages\httpx_models.py", line 829, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://openrouter.ai/api/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
This is blocking for all OpenRouter models.
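For context, the failing model group presumably maps to OpenRouter along these lines (a sketch of the equivalent programmatic Router setup, not my actual config file; the environment variable name is a placeholder):

```python
import os
from litellm import Router

# Equivalent of the proxy's model_list entry for the failing group.
router = Router(
    model_list=[
        {
            "model_name": "mistral-7b-instruct",
            "litellm_params": {
                "model": "openrouter/mistralai/mistral-7b-instruct",
                "api_key": os.environ.get("OPENROUTER_API_KEY"),  # placeholder
            },
        }
    ]
)

# Roughly the call the proxy's router makes (the log above shows it going
# through litellm.acompletion); it raises the BadRequestError shown above.
response = router.completion(
    model="mistral-7b-instruct",
    messages=[{"role": "user", "content": "Bonjour, qui es-tu ?"}],
)
```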