How do I configure a local OpenAI API? #5013
-
I set up a local model inference endpoint with MindIE on an Ascend server, and it follows the OpenAI API format. How do I configure it in Chatchat? I noticed that the model_settings.yaml file has a "custom openai" platform type; is that related? If so, how should it be configured? Thanks for your help.
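For reference, the "custom openai" platform type is the intended hook for exactly this case. A minimal sketch of a model_settings.yaml entry follows; the field names are based on Chatchat 0.3.x configuration templates, and the host, port, and model name are placeholder assumptions for a MindIE deployment, so verify everything against the template shipped with your installed version:

```yaml
# model_settings.yaml -- sketch only; field names per Chatchat 0.3.x templates,
# host/port/model name are assumptions for a local MindIE service.
MODEL_PLATFORMS:
  - platform_name: mindie                     # any label you choose
    platform_type: openai                     # or "custom openai", depending on version
    api_base_url: http://127.0.0.1:1025/v1    # your MindIE OpenAI-compatible endpoint
    api_key: EMPTY                            # MindIE typically needs no key; a placeholder works
    llm_models:
      - qwen2-7b                              # must match the model name MindIE serves
```

The key point is that `api_base_url` must point at your local service's OpenAI-compatible root (usually ending in `/v1`), and the entry under `llm_models` must match the model name the service reports.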
Answered by
zRzRzRzRzRzRzR
May 11, 2025
Replies: 2 comments
-
You can configure this through One API.
-
You can solve this either through One API or by directly modifying the base URL; most model APIs are compatible with the OpenAI interface.
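The "modify the base URL" route can be sanity-checked independently of Chatchat: any OpenAI-compatible service accepts a POST to `<base_url>/chat/completions`. A minimal sketch of how such a request is formed, where the host, port, and model name are placeholder assumptions for a local MindIE deployment:

```python
import json

# Assumed endpoint of the local MindIE service; adjust host and port to your setup.
BASE_URL = "http://127.0.0.1:1025/v1"


def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-compatible /chat/completions call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body


url, body = build_chat_request(BASE_URL, "qwen2-7b", "Hello")
print(url)  # http://127.0.0.1:1025/v1/chat/completions
```

Sending `body` to `url` with any HTTP client (for example `urllib.request` or the official `openai` Python package with `base_url=BASE_URL`) should return a standard chat-completion response if the endpoint really is OpenAI-compatible.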
Answer selected by
zRzRzRzRzRzRzR