Replies: 1 comment
No answer. Will open issue.
Windows 10 (not WSL)
I'm not entirely sure where the problem lies, but I haven't been able to get llamafiles to work with Firefox. It worked before.
I've tried running it with a handful of separate commands; all have the same outcome:
llamafile
Firefox
More info
Going directly to http://localhost:8080/ or http://127.0.0.1:8080/ in a browser tab and typing works, but the Firefox built-in ML features like 'Summarize' do not.
Setting browser.ml.chat.sidebar to "false" and using the Firefox ML feature from the context menu opens a tab with a URL like http://localhost:8080/?q=I’m+on+page+“Test+Webpage”+with+“foobar”+selected.%0A%0APlease+summarize+the+selection+using+precise+and+concise+language.+Use+headers+and+bulleted+lists+in+the+summary%2C+to+make+it+scannable.+Maintain+the+meaning+and+factual+accuracy.
I can see Firefox talk to the llamafile server via the command prompt window, but there is no response back from the llamafile server. Even typing in the chatbox and submitting doesn't yield a response. In Dev Tools, I see a 404 on the POST:
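To narrow down whether the 404 comes from the server or from what Firefox is requesting, it may help to probe the llamafile server directly from a terminal, outside the browser. This is a sketch assuming the default port 8080; llamafile's built-in server exposes both the native llama.cpp completion endpoint and an OpenAI-compatible one, and the exact path Firefox POSTs to may be something else entirely:

```shell
# Probe the native completion endpoint (assumes llamafile on default port 8080):
curl -s http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Say hello.", "n_predict": 16}'

# Probe the OpenAI-compatible chat endpoint, also served by llamafile:
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local", "messages": [{"role": "user", "content": "Say hello."}]}'
```

If these return JSON, the server itself is healthy, and the 404 likely means Firefox is POSTing to a path the server doesn't serve; comparing the request path shown in Dev Tools against the endpoints above should reveal the mismatch.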
