
FIM from Open WebUI seems impossible to make work #498

@mtoniott

Description


Hey!

Thanks for the project! I would really love to use it, but I'm having problems making it work on my machine.

I have a server dedicated to AI workloads, where I use llama.cpp to serve models to Open WebUI. I don't want to expose the llama.cpp endpoint directly and prefer to go through Open WebUI for security.

I want to make FIM work in twinny on my laptop. The chat model works like a charm with the "/api/v1/" API path.
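For reference, this is how I check the chat side outside of twinny. A minimal Python sketch; the host, API key, and model name are placeholders for my setup:

```python
import requests

BASE_URL = "http://my-ai-server:3000"  # placeholder: my Open WebUI host
API_KEY = "sk-..."                     # placeholder: an Open WebUI API key

# Chat goes through Open WebUI's OpenAI-compatible endpoint, and for me
# this request succeeds with the same "/api/v1/" path twinny uses.
resp = requests.post(
    f"{BASE_URL}/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "my-model",  # placeholder: the model served by llama.cpp
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.status_code, resp.text)
```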

For FIM, I really have no clue what is wrong. VS Code threw me a 405 error when I used /api/v1/, so I tried different endpoints, and the only different result I get is a 400 error when using "/api/v1/chat/completions".
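To see what the 400 actually complains about, I can replay a FIM-style request outside VS Code and print the response body, which usually explains what the server rejected. The payload shape here is an assumption on my part (a completion-style prompt with CodeLlama-style fill tokens), not necessarily what twinny sends:

```python
import requests

BASE_URL = "http://my-ai-server:3000"  # placeholder: my Open WebUI host
API_KEY = "sk-..."                     # placeholder: an Open WebUI API key

# Assumption: twinny's FIM request is completion-style (a single prompt),
# not chat-style messages, which could explain a 400 from a chat endpoint.
resp = requests.post(
    f"{BASE_URL}/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "my-model", "prompt": "<PRE> def add(a, b): <SUF> <MID>"},
)
print(resp.status_code)
print(resp.text)  # the error detail here should say why it's a 400
```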

[Screenshot: my twinny provider settings]

Here is a picture of my setup. What's wrong?

Issue #224 seemed to indicate that I should use /ollama/api/generate, which also gives me a nice 400 error.
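Here is the equivalent test for that path, assuming Open WebUI proxies /ollama/* to an Ollama backend. Since my backend is llama.cpp rather than Ollama, maybe that is why it 400s, but I'm not sure:

```python
import requests

BASE_URL = "http://my-ai-server:3000"  # placeholder: my Open WebUI host
API_KEY = "sk-..."                     # placeholder: an Open WebUI API key

# Ollama-style generate request via Open WebUI's /ollama proxy path.
# The body follows Ollama's /api/generate schema (model, prompt, stream).
resp = requests.post(
    f"{BASE_URL}/ollama/api/generate",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "my-model", "prompt": "def add(a, b):", "stream": False},
)
print(resp.status_code, resp.text)  # 400 for me as well
```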

What should I do / where does the problem come from?
