
Configuration of librechat.yaml for Azure OpenAI #7364

Thanks for the report, but this is not a bug; it's expected behavior.

  • In your config, the key under models: (the "model identifier") must match an actual LLM model name or identifier (like gpt-4.1).
  • If you use a random key (like foo:), LibreChat can't recognize the model, resulting in errors.
  • This matches what's stated in the docs:

    The model identifier must match its corresponding OpenAI model name.

Fix:
Change your model config to:

models:
  gpt-4.1:
    deploymentName: "gpt-4.1"

Arbitrary keys are not meant to be used under models:. The motivation behind this design is to support cases where the Azure deployment name is not itself a valid LLM model name or identifier (like gpt-4.1): the key tells LibreChat which model it is dealing with, while deploymentName points at the actual Azure deployment.
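
For instance, here is a minimal sketch of the surrounding endpoints.azureOpenAI section for that case, assuming the standard group layout; the group name, instance name, API version, deployment name, and environment variable below are placeholders rather than values from this thread:

endpoints:
  azureOpenAI:
    groups:
      - group: "example-group"               # placeholder group name
        apiKey: "${AZURE_OPENAI_API_KEY}"    # assumed environment variable reference
        instanceName: "example-instance"     # your Azure OpenAI resource name
        version: "2024-02-15-preview"        # example API version
        models:
          gpt-4.1:                           # key must be a recognized model name
            deploymentName: "my-gpt41-deployment"  # the actual Azure deployment name

Here the key gpt-4.1 identifies the model for LibreChat, while deploymentName carries the Azure-specific name used to reach the deployment.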

Do you have a specific…

Answer selected by apamildner

This discussion was converted from issue #7363 on May 13, 2025 21:11.