Describe the bug
When using OpenRouter for FIM, nothing happens. Looking at the logs (below), my theory is that the required `model` field is missing from the request body.
To Reproduce
Create an OpenRouter FIM provider with:
- Type: fim
- FIM Template: automatic
- Provider: openrouter
- Protocol: https
- Model Name: qwen/qwen3-coder
- Hostname: openrouter.ai
- API Path: /api/v1/completions
- API Key:
Expected behavior
FIM completions to work.
Logging
[INFO] ***Twinny Stream Debug***
Streaming response from openrouter.ai.
Request body:
{
"prompt": "<PRE> \n\n# Language: Perl (perl) \n# File uri: untitled:Untitled-1 (perl) \nStart of my code <SUF> \nEnd of my code <MID>",
"stream": true,
"temperature": 0.2,
"n_predict": 512
}
Request options:
{
"hostname": "openrouter.ai",
"path": "/api/v1/completions",
"protocol": "https",
"method": "POST",
"headers": {
"Content-Type": "application/json",
"Authorization": "Bearer <redacted>"
}
}
Number characters in all messages = 0
[ERROR_error] Fetch error
Error Type: Error
Error Message: Server responded with status code: 400
API Provider
OpenRouter
Chat or Auto Complete?
It is an issue with FIM. Using the same model and OpenRouter works fine for chatting.
Model Name
qwen/qwen3-coder (but it will be the same with other models)
Desktop:
- OS: Linux/Windows (tried on both)
- Browser: Firefox
- Version: 142
Additional context
As per the OpenRouter API documentation, the `model` field is required: https://openrouter.ai/docs/api-reference/completion
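For comparison, a sketch of what the request body would presumably need to look like for OpenRouter to accept it (assumption on my part, based on the linked docs; the prompt is abbreviated, and `max_tokens` is OpenRouter/OpenAI's name for the limit that llama.cpp-style backends call `n_predict`):

```json
{
  "model": "qwen/qwen3-coder",
  "prompt": "<PRE> ... <SUF> ... <MID>",
  "stream": true,
  "temperature": 0.2,
  "max_tokens": 512
}
```

So in addition to the missing `model`, the `n_predict` field may also be silently ignored by OpenRouter, though the 400 itself is most likely the missing `model`.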