server unknown argument: --mmproj #5967
TemporalAgent7 asked this question in Q&A
Reply:
Hi, multimodal support has been temporarily dropped in #5882.
Original question:
Hi,
I'm brand new to llama.cpp; I just built it on Windows (from the top of the master branch), and when I try to run server.exe it exits with an "unknown argument: --mmproj" error.
The model and the projector load fine if I use llava-cli from the same local build.
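For reference, the invocations look roughly like this (the model and projector file names below are just placeholders, not the actual files):

```
# Placeholder GGUF file names — substitute your own model and mmproj files.

# Fails on current master: server no longer accepts --mmproj
.\server.exe -m models\llava-v1.5-7b.Q4_K.gguf --mmproj models\mmproj-model-f16.gguf

# Works from the same local build: llava-cli loads both the model and the projector
.\llava-cli.exe -m models\llava-v1.5-7b.Q4_K.gguf --mmproj models\mmproj-model-f16.gguf --image test.png -p "Describe this image."
```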
Are there additional steps (configuration options, or different branches) needed to enable multimodal support in server?
Thank you!