
Chat not being sent to Ollama #142

@EngineersNeedArt

Description

Bug: I have entered the IP address for my Ollama instance running on a server on my LAN. Ollamac confirms reachability and gets a nonempty array of models. I enter text for a chat and hit ENTER, but nothing happens; nothing is sent to the server.
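
For what it's worth, the model list comes from Ollama's GET /api/tags endpoint, so something like the sketch below (the host/port are placeholders for my LAN server) can confirm independently of any app that the server returns models:

```swift
// Standalone check of the server side: Ollama lists installed models at
// GET /api/tags. The host and port below are placeholders for my LAN server.
import Foundation

struct TagsResponse: Decodable {
    struct Model: Decodable { let name: String }
    let models: [Model]
}

let url = URL(string: "http://192.168.0.10:11434/api/tags")!
let (data, _) = try await URLSession.shared.data(from: url)
let tags = try JSONDecoder().decode(TagsResponse.self, from: data)
print(tags.models.map(\.name))   // expect something like ["phi4:latest"]
```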

I've traced the bug to ChatView.generateAction().

The first guard tests !activeChat.model.isEmpty, and the model string is in fact empty.

This should not be the case: ChatViewModel.fetchModels() succeeded earlier and returned a single model (phi4:latest, FWIW).

(Commenting out the !activeChat.model.isEmpty guard allows the app to continue, but it fails later, I assume for the same reason: an empty string where a model name should be.)
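
Here is a rough sketch of what I think is happening; apart from ChatView.generateAction(), ChatViewModel.fetchModels(), and activeChat.model, which are the real names, the types and properties below are my guesses, not the actual Ollamac code:

```swift
import Foundation

final class Chat {
    var model: String = ""              // nothing ever assigns this, so it stays ""
}

final class ChatViewModel {
    var models: [String] = []

    // fetchModels() succeeds and fills `models`, but if nothing copies a
    // model name into the active chat, activeChat.model remains "".
    func fetchModels(into activeChat: Chat) {
        models = ["phi4:latest"]        // stand-in for the real network call

        // Possible fix: default the chat's model to the first fetched one
        // when the user hasn't picked anything yet.
        if activeChat.model.isEmpty, let first = models.first {
            activeChat.model = first
        }
    }
}

func generateAction(activeChat: Chat, prompt: String) {
    // The first guard in ChatView.generateAction(): with an empty model
    // string it bails out silently, so nothing is ever sent to the server.
    guard !activeChat.model.isEmpty else { return }
    // ... build and send the chat request to Ollama ...
}
```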

I'm on a MacBook Pro, macOS 15.3.2.

My ollama binary runs on a Linux server on my LAN. I have tested BoltAI against the same IP address/port of my ollama instance and it works correctly, so I think the problem is in Ollamac.

I see some warnings in the console when running from Xcode:

- Can't find or decode reasons
- Failed to get or decode unavailable reasons
- Picker: the selection "" is invalid and does not have an associated tag, this will give undefined results.
- ViewBridge to RemoteViewService Terminated: Error Domain=com.apple.ViewBridge Code=18 "(null)" UserInfo={com.apple.ViewBridge.error.hint=this process disconnected remote view controller -- benign unless unexpected, com.apple.ViewBridge.error.description=NSViewBridgeErrorCanceled}

The above may be harmless, or, if one of them is a true exception, it may be preventing the code that follows from executing. The Picker warning in particular looks related to the empty model string; see the sketch below.
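
A minimal SwiftUI sketch of that warning, assuming the model picker binds its selection to a plain String: if the selection starts as "" and no option carries .tag(""), SwiftUI logs exactly that message and never writes a model back to the chat. Seeding the selection once the model list arrives would avoid it (the view and property names here are hypothetical, not Ollamac's):

```swift
import SwiftUI

struct ModelPickerSketch: View {
    @State private var selectedModel: String = ""   // "" has no matching tag -> Picker warning
    let models = ["phi4:latest"]

    var body: some View {
        Picker("Model", selection: $selectedModel) {
            ForEach(models, id: \.self) { model in
                Text(model).tag(model)
            }
        }
        .onAppear {
            // One way to avoid the warning: seed the selection with the
            // first available model as soon as the list is known.
            if selectedModel.isEmpty, let first = models.first {
                selectedModel = first
            }
        }
    }
}
```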
