Connecting to the Baidu Qianfan API returns a 400 error: 400 Message index[2] content should be string #840
Replies: 6 comments
-
https://www.volcengine.com/docs/82379/1302008 The DouBao large model and the DeepSeek v3 model from VolcanoArk cannot be used via OpenAI-compatible connections; they require an API adapter for VolcanoArk's large model service platform.
-
The problem seems to be that the Bearer token is not handled properly by Roo-Code, so we cannot use the OpenAI Compatible provider with models like Baidu Qianfan ModelBuilder and luchentech cloud.
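For what it's worth, the OpenAI Python SDK sends the api_key as a standard Authorization: Bearer <key> header, so the token format by itself should not be the blocker. Here is a minimal sketch to check that directly with plain HTTP, assuming the endpoint follows the usual OpenAI-compatible /chat/completions path (the base URL and model name are taken from the working example later in this thread; the token is a placeholder):
import requests

# Reproduce the same call the OpenAI SDK would make, with an explicit Bearer header.
resp = requests.post(
    "https://qianfan.baidubce.com/v2/chat/completions",
    headers={"Authorization": "Bearer <your-qianfan-bce-v3-token>"},  # placeholder bearer token
    json={
        "model": "deepseek-v3",
        "messages": [{"role": "user", "content": "Hello!"}],  # content as a plain string
    },
    timeout=30,
)
print(resp.status_code, resp.json())
If this succeeds, the token and header are fine, and the 400 is more likely caused by the shape of the messages Roo-Code sends than by authentication.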
-
I ran into this problem too, but the same endpoint works in Chatbox with the OpenAI Compatible provider.
-
Is there any solution to this? I use the following code and it works well, so it may be an adapter problem.
from openai import OpenAI

client = OpenAI(
    api_key="bce-v3/ALTAK-/f69945bc8704fe2cdf23",  # Qianfan ModelBuilder platform bearer token
    base_url="https://qianfan.baidubce.com/v2",  # Qianfan ModelBuilder platform domain
)
completion = client.chat.completions.create(
    model="deepseek-v3",  # for preset services, see the list of supported models
    messages=[
        {'role': 'system', 'content': 'You are a helpful assistant.'},
        {'role': 'user', 'content': 'Hello!'},
    ],
)
print(completion.choices[0].message)
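On the error itself: the snippet above sends each message's content as a plain string. The 400 "Message index[2] content should be string" suggests the failing client is instead sending content as an array of content parts (the OpenAI multi-part format), which Qianfan's OpenAI-compatible endpoint apparently rejects. That is a guess from the error text, not a confirmed diagnosis; the illustration below just contrasts the two payload shapes.
# Hypothetical illustration of the two message shapes; only the first is accepted
# by endpoints that require string content, per the 400 error in this thread.
accepted = {"role": "user", "content": "Hello!"}  # plain string content
rejected = {"role": "user", "content": [{"type": "text", "text": "Hello!"}]}  # array-of-parts content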
-
Thanks for flagging! Attached to #770 since, IIUC, these are essentially the same issue (need a Volcano adapter). If these are different provider support requests, please flag it and we can decouple them.
-
@steveukuk @DingZhenPearl I've made a PR to fix this: #891. You can try my branch to see if it works.
-
Which version of the app are you using?
v3.3.9
Which API Provider are you using?
OpenAI Compatible
Which Model are you using?
deepseek-v3
What happened?
400 Message index[2] content should be string
Steps to reproduce
Relevant API REQUEST output
Additional context
No response