What happened?
This works:

```python
import os

import openai
from dotenv import load_dotenv

# Load the proxy credentials from .env
load_dotenv()
api_key = os.getenv("LITELLM_PROXY_API_KEY")
api_base = os.getenv("LITELLM_PROXY_API_BASE")

client = openai.OpenAI(
    api_key=api_key,
    base_url=api_base,
)

model_id = "cohere.embed-multilingual-v3"
response = client.embeddings.create(model=model_id, input="What is this", encoding_format=None)
print(len(response.data[0].embedding))
```
This doesn't work, even if you set `drop_params: true` for this model in the config.yaml. The only difference is that `encoding_format=None` is omitted:
```python
import os

import openai
from dotenv import load_dotenv

load_dotenv()
api_key = os.getenv("LITELLM_PROXY_API_KEY")
api_base = os.getenv("LITELLM_PROXY_API_BASE")

client = openai.OpenAI(
    api_key=api_key,
    base_url=api_base,
)

model_id = "cohere.embed-multilingual-v3"
# Identical call, but without encoding_format=None
response = client.embeddings.create(model=model_id, input="What is this")
print(len(response.data[0].embedding))
```
The error I get is:

```
openai.BadRequestError: Error code: 400 - {'error': {'message': 'litellm.BadRequestError: BedrockException - {"message":"Malformed input request: #/embedding_types: expected type: JSONArray, found: String, please reformat your input and try again."}. Received Model Group=cohere.embed-multilingual-v3\nAvailable Model Group Fallbacks=None', 'type': None, 'param': None, 'code': '400'}}
```
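For context, the Bedrock message says `embedding_types` must be a JSON array, which suggests that when `encoding_format` is omitted, a bare string (rather than a list) ends up in the Bedrock request body. A minimal sketch of the mapping the proxy would presumably need, using a hypothetical helper name (not LiteLLM's actual code):

```python
def map_encoding_format(encoding_format):
    """Hypothetical sketch: Bedrock's Cohere embed API expects
    embedding_types as a JSON array, so a missing encoding_format
    should default to a one-element list, never a bare string."""
    if encoding_format is None:
        return ["float"]          # assumed default when the caller omits it
    if isinstance(encoding_format, str):
        return [encoding_format]  # wrap a single value in an array
    return list(encoding_format)  # already a sequence of types


print(map_encoding_format(None))     # → ['float']
print(map_encoding_format("float"))  # → ['float']
```

Both calls produce a JSON-array-compatible value, which is the shape the Bedrock error above asks for.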
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.63.3-nightly
Twitter / LinkedIn details
No response