C program for interacting with an Ollama server from a Linux terminal. It is not meant to be a complete library at all. At the moment, it's just the simplest interface possible, developed solely for my personal, daily usage.
- 0.0.1
- supports only SSL/TLS (sorry, security by design, baby)
- libssl.so.3
- libcrypto.so.3
- libc.so.6
sudo apt-get install libssl-dev
git clone https://github.com/lucho-a/Ollama-C-lient.git
cd Ollama-C-lient/src/
gcc -o ollama-c-lient Ollama-C-lient.c lib/* -lssl -lcrypto
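After building, you can sanity-check that the three runtime libraries listed above are resolvable on your system. This is a generic check using the standard `ldconfig` tool, not something the project provides:

```shell
# List the cached shared libraries and keep only the three the client needs.
ldconfig -p | grep -E 'libssl\.so\.3|libcrypto\.so\.3|libc\.so\.6'
```

`ldd ./ollama-c-lient` shows the same information for the built binary specifically.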
- $ ollama-c-lient [options]
The options supported are:
- The sent messages & the responses are written into the context file if '--context-file' is specified.
- If '--context-file' is specified and it doesn't exist, it will be created if possible.
- If '--context-file' is specified, the latest X messages/responses (parameter '--max-msgs-ctx') are read as context when the program starts.
- So, if '--max-msgs-ctx' > 0 and '--context-file' is not set up, the program will start without any context. Nevertheless, as long as chats succeed, they will be stored in RAM and taken into account in the subsequent interactions. (1)
- If '--max-msgs-ctx' == 0, the interactions won't be recorded into the context file.
- If '--stdout-chunked' is set, '--response-speed' is ignored.
- The font format in the '--color-font-...' options must be an ANSI value ("XX;XX;XX").
- If the entered prompt finishes with ';', the query/response won't take the current context ('--max-msgs-ctx') into account and won't be written to the context file.
- If the entered prompt finishes with ';', the query/response won't be part of subsequent context messages. (1)
- Ctrl-C cancels the response.
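The "XX;XX;XX" values are ordinary ANSI SGR parameters, so you can preview a value in your terminal before wiring it into a '--color-font-...' option. This uses plain printf escapes only, nothing from the client itself; the four values are the ones used in the script below:

```shell
# Print a sample line for each color value used in the example script.
for sgr in '0;0;90' '0;0;37' '0;2;33' '0;0;31'; do
    printf '\033[%smsample text\033[0m  <- "%s"\n' "$sgr" "$sgr"
done
```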
Just use a script like:
#!/bin/bash
OCL="/home/user/ollama-c-lient/ollama-c-lient
--server-addr 192.168.1.234
--server-port 443
--model mistral
--temperature 0.6
--response-speed 15000
--max-msgs-ctx 5
--max-msgs-tokens 8196
--keep-alive 1800
--context-file /home/user/ollama-c-lient/chat.context
--system-role-file /home/user/ollama-c-lient/chat.role
--color-font-response '0;0;90'
--color-font-system '0;0;37'
--color-font-info '0;2;33'
--color-font-error '0;0;31'
--stdout-parsed"
echo
while true; do
read -e -p "-> " input
if [[ $input == "" ]]; then
break
fi
echo "$input" | $OCL
history -s "$input"
done
history -cw
echo
exit 0
(echo 'What can you tell me about my storage: ' && df) | ./ollama-c-lient --server-addr 192.168.5.123 --server-port 4433 --model deepseek-r1 --context-file ~/agents/dfAgentContextFile.context --stdout-parsed >> log-file.log
./ollama-c-lient --model deepseek-r1 < prompt.txt
(echo 'What can you tell me about the content of this file?: ' && cat /home/user/file.txt) | ./ollama-c-lient --server-addr 192.168.5.123 --server-port 4433 --model deepseek-r1 --stdout-parsed < prompt.txt
(echo -n 'At ' && date +"%Y-%m-%d %H:%M:%S" && echo 'What can you tell me about current processes?: ' && ps ax) | ./ollama-c-lient --server-addr 192.168.1.2 --server-port 443 --model mistral --stdout-parsed --stdout-chunked | grep 'Ollama-C-lient'
(echo 'Tell me a prompt about whatever: ' && cat whatever.txt) | ./ollama-c-lient --server-addr 192.168.1.2 --server-port 443 --model mistral | ./ollama-c-lient --server-addr 192.168.1.2 --server-port 443 --model deepseek-r1:14b --stdout-parsed
(echo 'What can you tell me about this painting? ') | ./ollama-c-lient --model gemma3:12b --stdout-parsed --response-speed 15000 --color-font-response "0;0;90" --image-file ~/paints/van-gogh.jpg
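All of the one-liners above share the same shape: a subshell concatenates a question and some command output into a single stdin stream for the client. A stand-alone sketch of that plumbing, with `cat` standing in for ollama-c-lient so it runs without a server:

```shell
# The subshell writes both parts to one pipe; the client (here: cat)
# receives them as a single combined prompt.
(echo 'Summarize this listing: ' && ls /) | cat
```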