A lightweight chat server that leverages Ollama (LLM API) to generate intelligent, conversational responses. Designed for easy local deployment and multi-client support, this project is ideal for experimenting with LLM chatbots or building custom chat applications.
- Multi-client Chat Server: Supports multiple concurrent clients via TCP sockets.
- LLM-Powered Responses: Integrates with Ollama for AI-generated replies.
- Customizable System Prompt: Define the assistant's personality and behavior.
- Simple Python Clients: Two ready-to-use client scripts for easy testing.
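The multi-client pattern in the first feature can be sketched as follows: one accept loop, one thread per connected client. This is a hypothetical illustration, not the project's `src/base_server.py`; the real handler would call the LLM instead of echoing.

```python
# Hypothetical sketch of a multi-client TCP server: each accepted connection
# is served on its own daemon thread. Names (handle_client, start_server)
# are illustrative only.
import socket
import threading


def handle_client(conn: socket.socket) -> None:
    """Echo each message back; the real server would return an LLM reply here."""
    with conn:
        while True:
            data = conn.recv(4096)
            if not data:
                break  # client disconnected
            conn.sendall(data)  # placeholder for the AI-generated response


def start_server(host: str = "localhost", port: int = 12345) -> socket.socket:
    """Bind, listen, and dispatch each client to its own thread."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()

    def accept_loop() -> None:
        while True:
            try:
                conn, _addr = srv.accept()
            except OSError:
                break  # listening socket was closed
            threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv
```

Per-client threads keep a slow LLM call for one client from blocking the others, which is the usual reason for this design over a single-threaded accept loop.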
- Python 3.10+
- Miniconda
- Ollama
- Download and install Miniconda from here.
- Clone the repository:

  ```bash
  git clone https://github.com/24-mohamedyehia/Ollama-Chat-Server.git
  ```

- Create a new environment:

  ```bash
  conda create --name ollama_chat_server python=3.10 -y
  ```

- Activate the environment:

  ```bash
  conda activate ollama_chat_server
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Download and install Ollama from here.
- Start your Ollama server and make sure the model specified in `src/base_server.py` is available.
- Run the chat server:

  ```bash
  python src/base_server.py
  ```
- Open a new terminal and run either client:

  ```bash
  python src/client_one.py
  # or
  python src/client_two.py
  ```
- Type your message and press Enter.
- Type `exit` to end the session.
- The server will respond with an AI-generated reply: always in English, concise, and with emojis.
- Model: Change the `model_llm` variable in `src/base_server.py` to use a different Ollama model.
- System Prompt: Customize the `system_message` variable to alter the assistant's behavior.
- Port: The server listens on `localhost:12345` by default. Edit it in `start_server()` if needed.
- Environment: Use the `.env` or `.env.example` files if you want to manage environment variables (not strictly required for the default setup).
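To illustrate how these configurable values might feed a request, here is a hedged sketch using Ollama's standard `/api/chat` endpoint. The variable names mirror those mentioned above, but the function names, the default model, and the prompt text are assumptions, not the project's actual code.

```python
# Hypothetical sketch: build and send a chat request to a local Ollama server.
# model_llm and system_message correspond to the configurable variables above.
import requests

model_llm = "llama3"  # assumed default; any model pulled into your local Ollama
system_message = "You are a concise, helpful assistant. Always answer in English."


def build_chat_payload(user_text: str) -> dict:
    """Assemble the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model_llm,
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": user_text},
        ],
        "stream": False,  # request a single complete reply, not a token stream
    }


def ask_ollama(user_text: str, host: str = "http://localhost:11434") -> str:
    """POST the payload to a running Ollama server and return the reply text."""
    resp = requests.post(f"{host}/api/chat", json=build_chat_payload(user_text), timeout=60)
    resp.raise_for_status()
    return resp.json()["message"]["content"]
```

Swapping models or rewriting the system prompt then only requires changing the two module-level variables, which matches the configuration story above.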
- Python (socket, threading, requests)
- Ollama (LLM server)
- colorama (for colored client output)
This project is licensed under the MIT License - see the LICENSE file in this repository.
- Ollama for the LLM API
- Developed by Mohamed Yehia