Chat-deepseek-r1:32b with Telegram Bot

This Python script lets you control a locally hosted AI model (deepseek-r1:32b served through Ollama) from both a Telegram bot and a Tkinter-based graphical user interface (GUI). You can start and stop the model and chat with it via Telegram messages.

Features

Telegram Bot Integration

  • Control the AI model by sending /start_model and /stop_model commands through Telegram.
  • While the model is running, any text message sent to the bot is passed to the AI model and the response is sent back to the user (see the handler sketch below).
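
A minimal sketch of how these commands could be wired up, assuming python-telegram-bot version 20 or later (async API); the MODEL_RUNNING flag and handler names are illustrative stand-ins for the script's own state and functions, not the repository's exact code:

from telegram import Update
from telegram.ext import Application, CommandHandler, ContextTypes

BOT_TOKEN = "your_bot_token_here"
MODEL_RUNNING = False  # hypothetical flag tracking whether the model is active

async def start_model(update: Update, context: ContextTypes.DEFAULT_TYPE):
    global MODEL_RUNNING
    MODEL_RUNNING = True
    await update.message.reply_text("Model Running")

async def stop_model(update: Update, context: ContextTypes.DEFAULT_TYPE):
    global MODEL_RUNNING
    MODEL_RUNNING = False
    await update.message.reply_text("Model Stopped")

app = Application.builder().token(BOT_TOKEN).build()
app.add_handler(CommandHandler("start_model", start_model))
app.add_handler(CommandHandler("stop_model", stop_model))
app.run_polling()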

Tkinter GUI

  • A simple window with Start and Stop buttons for the AI model and a label showing its current status (a sketch follows).
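
A minimal GUI sketch, assuming the script simply flips a running flag; the start_model/stop_model callbacks here are hypothetical placeholders for the script's actual start/stop logic:

import tkinter as tk

def start_model():
    # In the real script this would also launch the model; here it only updates the label.
    status_label.config(text="Model Running", fg="green")

def stop_model():
    status_label.config(text="Model Stopped", fg="red")

root = tk.Tk()
root.title("AI Model Control")

status_label = tk.Label(root, text="Model Stopped", fg="red")
status_label.pack(pady=10)

tk.Button(root, text="Start", command=start_model).pack(side="left", padx=10, pady=10)
tk.Button(root, text="Stop", command=stop_model).pack(side="right", padx=10, pady=10)

root.mainloop()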

Prerequisites

  • Python 3.8 or higher: Ensure Python is installed on your system.
  • Telegram Bot Token: Obtain a bot token by creating a new bot through the BotFather on Telegram.
  • Chat ID: Determine the chat ID the bot should send messages to. Tools such as getidsbot can find it, or you can query the Bot API directly (see the sketch below).
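
One way to look up the chat ID without a third-party bot: send your bot any message, then call the Bot API's getUpdates method. This sketch uses the requests package and is not part of the repository's script:

import requests

BOT_TOKEN = "your_bot_token_here"
resp = requests.get(f"https://api.telegram.org/bot{BOT_TOKEN}/getUpdates", timeout=10)
for update in resp.json().get("result", []):
    message = update.get("message")
    if message:
        print("Chat ID:", message["chat"]["id"])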

Installation

Install Required Packages

Use pip to install the necessary Python packages:

pip install ollama python-telegram-bot psutil requests

  • ollama: Python client for the Ollama runtime that serves the model.
  • python-telegram-bot: Library for interacting with the Telegram Bot API.
  • psutil: System and process utilities (used here to stop the model process).
  • requests: For sending HTTP requests.

Configure the Script

Open the script and replace the placeholders with your actual BOT_TOKEN and CHAT_ID:

BOT_TOKEN = "your_bot_token_here"
CHAT_ID = "your_chat_id_here"

Usage

Run the Script

Execute the Python script:

python your_script.py

Using the Tkinter GUI

  • Start Button: Click to start the AI model. The status label will display "Model Running" in green.
  • Stop Button: Click to stop the AI model. The status label will display "Model Stopped" in red.

Using the Telegram Bot

  • Start the Model: Send /start_model to the bot.
  • Stop the Model: Send /stop_model to the bot.
  • Interact with the Model: While the model is running, send any text message to the bot and it will reply with the AI-generated response (see the sketch below).
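
A sketch of how an incoming message could be forwarded to the model through the ollama Python client, assuming python-telegram-bot version 20 or later; the MODEL_NAME value is an assumption and must match a model pulled locally in Ollama:

import ollama
from telegram import Update
from telegram.ext import ContextTypes, MessageHandler, filters

MODEL_NAME = "deepseek-r1:32b"  # assumed model tag; adjust to your local model

async def handle_message(update: Update, context: ContextTypes.DEFAULT_TYPE):
    prompt = update.message.text
    # ollama.chat blocks until the full response is generated
    response = ollama.chat(
        model=MODEL_NAME,
        messages=[{"role": "user", "content": prompt}],
    )
    await update.message.reply_text(response["message"]["content"])

# Registered alongside the command handlers, e.g.:
# app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, handle_message))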

Notes

  • Ensure that the AI model specified in MODEL_NAME is correctly configured and available locally (for example, pulled with ollama pull deepseek-r1:32b).
  • The script includes a function that terminates the ollama_llama_server.exe process. Adjust this function if your AI model runs under a different process name (see the sketch below).
  • Always keep your BOT_TOKEN confidential to prevent unauthorized access to your bot.
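
A sketch of stopping the Ollama server process by name with psutil; the process name below is the one mentioned above and may differ on your platform (for example plain "ollama" on Linux or macOS):

import psutil

def stop_ollama_process(name="ollama_llama_server.exe"):
    # Find every process with the given name and ask it to exit.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            proc.terminate()
            try:
                proc.wait(timeout=5)
            except psutil.TimeoutExpired:
                proc.kill()  # force-kill if it does not exit in time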
