-
I am also struggling to host llama.cpp on IIS from my Windows VM. Separately, I tried `--host 'my_IP' --port 8080`, but I get an error like "couldn't bind to server socket: hostname='my_IP' port=8080". Because of this I cannot access llama.cpp from outside the VM. Can someone help with this, or am I getting it all wrong? If this problem has already been solved, please link to that issue. Thanks in advance.
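If the bind error keeps coming up, the usual cause is that the value passed to `--host` is not an address actually assigned to a network adapter inside the VM. As a rough diagnostic (a minimal Python sketch, not part of llama.cpp; the specific IP below is a placeholder), you can check whether an address is bindable at all before pointing the server at it:

```python
# Hypothetical bind check: verifies whether the address passed to --host
# can actually be bound on this machine (the same failure mode as the
# "couldn't bind to server socket" error from the llama.cpp server).
import socket
import sys

def can_bind(host: str, port: int) -> bool:
    """Try to bind a TCP socket to (host, port) and report the result."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind((host, port))
        return True
    except OSError as exc:
        print(f"bind failed for {host}:{port} -> {exc}", file=sys.stderr)
        return False

if __name__ == "__main__":
    # 0.0.0.0 listens on all interfaces; a specific IP must be one that is
    # assigned to an adapter inside this VM (check with `ipconfig`).
    for candidate in ("0.0.0.0", "192.168.1.50"):  # 192.168.1.50 is a placeholder
        print(candidate, "->", "OK" if can_bind(candidate, 8080) else "cannot bind")
```

If `0.0.0.0` binds but your specific IP does not, a common workaround is to start the server with `--host 0.0.0.0 --port 8080`, open that port in the Windows firewall, and reach it from outside the VM via the VM's own address (or through an IIS reverse proxy in front of it).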
-
When I run the server with the command `./server --host 0.0.0.0` on macOS Sonoma 14.3, I can't access the server from another computer on the same network. It works on another Mac running macOS 12.7. On the same computer running Sonoma, I can access it by opening either `localhost:8080` or `192.168.1.101:8080`.

I created a test app in Python using Flask, and I was able to access it from another computer. When the firewall is enabled, my Flask app triggers a macOS dialog that asks whether I want to allow Python to listen for incoming connections. Strangely, running `llama.cpp/server` does not trigger this dialog. I tried turning off the firewall, but that didn't work either.
I would greatly appreciate any insights or suggestions!
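One way to narrow this down (a minimal sketch, not specific to llama.cpp; it only checks raw TCP reachability of the address and port mentioned above) is to run a plain connect test from the other computer:

```python
# Hypothetical reachability check, run from the *other* computer on the LAN:
# attempts a plain TCP connect to the Mac that runs ./server.
# 192.168.1.101 and 8080 are taken from the post above; adjust as needed.
import socket

HOST, PORT = "192.168.1.101", 8080

try:
    with socket.create_connection((HOST, PORT), timeout=3):
        print(f"TCP connect to {HOST}:{PORT} succeeded -- the port is reachable.")
except OSError as exc:
    # A timeout usually points at a firewall/filtering issue; "connection
    # refused" suggests nothing is listening on that address at all.
    print(f"TCP connect to {HOST}:{PORT} failed: {exc}")
```

If the connect times out while the Flask test app on the same Mac is reachable, the block is most likely at the macOS application-firewall layer, which decides per binary whether to allow incoming connections; if the connection is refused, the server is probably only listening on loopback.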