[Question]: Embeddings and LLM compatibility in server #1864

@DavidGOrtega

Description

Do you need to ask a question?

  • I have searched the existing questions and discussions, and this question is not already answered.
  • I believe this is a legitimate question, not just a bug or feature request.

Your Question

Although the project states that it is compatible with other embeddings and LLMs, I cannot make it work in server mode.
I have tried LLMs and embeddings that I regularly use with the OpenAI library.

EMBEDDING_BINDING=openai
EMBEDDING_MODEL=voyage-3-large
EMBEDDING_DIM=1024
EMBEDDING_BINDING_HOST=https://api.voyageai.com/v1/embeddings
EMBEDDING_BINDING_API_KEY=XXXX

LLM_BINDING=openai
LLM_MODEL=gemini-2.5-flash
LLM_BINDING_HOST=https://generativelanguage.googleapis.com/v1beta/openai/
LLM_BINDING_API_KEY=XXXX

I cannot make them work. I will look into the code, but the embedding requests return a 404 (the same endpoint works via curl).
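One plausible cause of the 404 (an assumption on my part, not verified against this project's code): OpenAI-style clients treat the configured host as a base URL and append the per-request path (e.g. /embeddings) themselves. If EMBEDDING_BINDING_HOST already contains the full endpoint path, the path gets appended twice. A minimal sketch of that joining behavior, with a hypothetical request_url helper:

```python
def request_url(base_url: str, path: str) -> str:
    """Join a base URL and a request path the way OpenAI-style clients do:
    strip the trailing slash from the base, then append the path."""
    return base_url.rstrip("/") + "/" + path.lstrip("/")

# Full endpoint configured as the host: the path is appended again,
# producing a URL the server likely answers with 404.
print(request_url("https://api.voyageai.com/v1/embeddings", "/embeddings"))
# https://api.voyageai.com/v1/embeddings/embeddings

# Base URL only: the client builds the correct endpoint.
print(request_url("https://api.voyageai.com/v1", "/embeddings"))
# https://api.voyageai.com/v1/embeddings
```

If this assumption holds, setting EMBEDDING_BINDING_HOST to the base URL (without the trailing /embeddings) would match what a direct curl to the full endpoint does.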

Additional Context

No response

Metadata

Assignees

No one assigned

    Labels

    question (Further information is requested)
