WisBot

A Multi-functional Discord Automation Bot

WisBot is a versatile automation bot that combines Discord integration, AI capabilities, file management, and a web interface into one powerful solution.

Key Features:

  • Discord Integration: Seamless connection with the Discord API for command handling and message processing
  • AI Capabilities: Integration with Ollama to provide LLM functionality for natural language interactions
  • File Management: Robust file upload/download system with database storage, user tracking, and automatic cleanup (see the sketch after this list)
  • Web Interface: User-friendly web interface built with Go templates for interaction outside Discord
  • Observability: Comprehensive monitoring and tracing with OpenTelemetry integration
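
The File Management feature's automatic cleanup could run as a small background job. Below is a minimal sketch, assuming a hypothetical files table with an uploaded_at column (not WisBot's actual schema) and the github.com/lib/pq Postgres driver:

package main

import (
	"database/sql"
	"fmt"
	"log"
	"os"
	"time"

	_ "github.com/lib/pq" // assumed Postgres driver; WisBot's actual driver may differ
)

// cleanupExpiredFiles removes uploads older than retentionDays.
// The "files" table and "uploaded_at" column are illustrative only.
func cleanupExpiredFiles(db *sql.DB, retentionDays int) error {
	cutoff := time.Now().AddDate(0, 0, -retentionDays)
	if _, err := db.Exec(`DELETE FROM files WHERE uploaded_at < $1`, cutoff); err != nil {
		return fmt.Errorf("cleanup expired files: %w", err)
	}
	return nil
}

func main() {
	db, err := sql.Open("postgres", os.Getenv("DATABASE_URL"))
	if err != nil {
		log.Fatal(err)
	}
	// Run the cleanup once a day for as long as the bot is up,
	// matching the DELETE_FILES_AFTER_DAYS default of 7 days.
	for range time.Tick(24 * time.Hour) {
		if err := cleanupExpiredFiles(db, 7); err != nil {
			log.Println(err)
		}
	}
}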

Getting Started

Quick Start Guide

  1. Set up your environment variables (see Running the bot)
  2. Deploy using Docker or Go (see deployment options)
  3. Interact with WisBot using commands

Tiltfile Quickstart

The Tiltfile in this repository is designed to streamline the development process for WisBot. It automates the setup of the development environment, including:

  1. Uses Docker Compose to orchestrate the services defined in compose.yaml
  2. Sets up code generation for Templ and SQLC, watching for changes in template and SQL files
  3. Compiles the Go application with proper settings for Linux targets
  4. Configures Docker build with live updates to sync code changes without full rebuilds
  5. Sets resource dependencies to ensure proper build order
  6. Organizes resources with labels for better UI grouping
  7. Configures port forwarding for the database dashboard

With this configuration, developers can run tilt up to start WisBot with live reload capabilities, making the development workflow much more efficient.

Architecture

WisBot Design Architecture

flowchart TD
subgraph subgraph1["Users"]
    User["Discord Members"]
    WebUser["Web Users"]
end
subgraph subgraph7["Business Users"]
    Employees
    Developers
end
subgraph subGraph5["Connected Services"]
    NocoDB["NocoDB Project Management"]
    Postiz["Postiz Social Media Scheduler"]
    Superset["Superset Data Analytics"]
end
subgraph WisBot["WisBot"]
    Discord["Discord API"]
    WebSite["WebSite"]
    DiscordBot["Discord Bot"]
    FileManager["File Manager"]
    WisLLM["WisLLM Processor"]
    WebServerAPI["Web Server REST API"]
end
subgraph subGraph3["LLM Platform"]
    Models[("Model Storage")]
    Ollama["Ollama LLM Service"]
end
subgraph DataBases["Data Warehousing"]
    subgraph subGraph4["Data Cache and Pipelines"]
        Redis["Redis Cache"]
    end
        DataWarehouse[("Data Warehouse PostgreSQL DB")]
        WisbotDB[("Wisbot PostgreSQL DB")]
        bizS3["minIO S3 Object Storage"]
        Drives["Documents/Drives - Proton, Sharepoint, Google"]
        VectorDB[("pgVector Database")]
end

    Ollama -- "llama3.2" --> Models
    User --> Discord
    Employees --> NocoDB & Postiz
    Developers --> Superset
    WebUser --> WebSite
    WebSite --> WebServerAPI
    Discord --> DiscordBot
    DiscordBot --> WisLLM & FileManager
    WebServerAPI --> WisLLM & FileManager
    FileManager --> WisbotDB
    WisbotDB <--> DataWarehouse
    DataWarehouse <--> VectorDB
    VectorDB <--> Drives
    WisLLM <--> WisbotDB
    bizS3 <--> VectorDB
    Postiz --> WebServerAPI
    WisLLM <--> Ollama
    Postiz --> Redis --> bizS3
    NocoDB --> bizS3 & DataWarehouse & WebServerAPI
    Superset --> DataWarehouse & bizS3

User Experience Workflows

User Experience Diagrams

Commands

Command            Description
/wis help          Displays available commands and usage information
/wis upload        Uploads a file to the server for processing or storage
/wis llm <text>    Sends a request to the integrated LLM and returns its response
/wis stats         Displays system statistics and performance metrics
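
As a rough illustration of how a /wis slash command could be handled in Go, here is a minimal sketch. It assumes the widely used github.com/bwmarrin/discordgo library and made-up reply text; WisBot's actual command handling may be structured differently:

package main

import (
	"log"
	"os"

	"github.com/bwmarrin/discordgo"
)

func main() {
	s, err := discordgo.New("Bot " + os.Getenv("DISCORD_TOKEN_WISBOT"))
	if err != nil {
		log.Fatal(err)
	}

	// Respond to /wis subcommands as they arrive.
	s.AddHandler(func(s *discordgo.Session, i *discordgo.InteractionCreate) {
		if i.Type != discordgo.InteractionApplicationCommand {
			return
		}
		data := i.ApplicationCommandData()
		if data.Name != "wis" || len(data.Options) == 0 {
			return
		}

		var reply string
		switch data.Options[0].Name {
		case "help":
			reply = "Available commands: help, upload, llm, stats"
		case "llm":
			// A real handler would forward the prompt to the Ollama service here.
			reply = "LLM response would go here"
		default:
			reply = "Unknown subcommand"
		}

		s.InteractionRespond(i.Interaction, &discordgo.InteractionResponse{
			Type: discordgo.InteractionResponseChannelMessageWithSource,
			Data: &discordgo.InteractionResponseData{Content: reply},
		})
	})

	if err := s.Open(); err != nil {
		log.Fatal(err)
	}
	defer s.Close()
	select {} // keep the bot running
}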

Core Beliefs

WisBot adheres to these core principles:

  • Reliability

    • Must never fail silently
    • Must always trace errors with complete stack information (see the error-handling sketch after this list)
    • Must return meaningful errors to users
    • Must fail gracefully when external systems are unavailable
  • Performance

    • Must be fast and responsive
    • Must handle resources efficiently
    • Must enforce user quotas and limits
    • Must cleanup expired resources automatically
  • Architecture

    • Must maintain separation of concerns
    • Must be containerized and deployable
    • Must be observable and monitorable
    • Must scale when needed
  • User Experience

    • Must communicate clearly with users
    • Must keep data secure
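
To make the reliability and observability principles concrete, here is a minimal sketch of wrapping an error and recording it on an OpenTelemetry span with the go.opentelemetry.io/otel API. The operation and helper names are made up for illustration and are not WisBot's actual code:

package main

import (
	"context"
	"fmt"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/codes"
)

// processUpload is a made-up operation name used to illustrate the pattern.
func processUpload(ctx context.Context, fileID string) error {
	ctx, span := otel.Tracer("wisbot").Start(ctx, "processUpload")
	defer span.End()

	if err := storeFile(ctx, fileID); err != nil {
		// Record the error on the span so it shows up in traces,
		// and wrap it so callers get a meaningful message instead of silence.
		span.RecordError(err)
		span.SetStatus(codes.Error, err.Error())
		return fmt.Errorf("process upload %s: %w", fileID, err)
	}
	return nil
}

// storeFile stands in for the real persistence step.
func storeFile(ctx context.Context, fileID string) error { return nil }

func main() {
	if err := processUpload(context.Background(), "example"); err != nil {
		fmt.Println(err)
	}
}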

Requirements & Dependencies

Runtime Requirements

  • Discord Token
  • PostgreSQL database
  • Ollama (optional, for LLM functionality)
  • Llama3.2 model if using Ollama (ollama pull llama3.2)
  • Nvidia GPU with Container Toolkit (optional, for improved LLM performance)
  • Docker and Docker Compose (recommended deployment method)

Development Requirements

  • Golang 1.24
  • Git
  • The following development tools:

Templ, SQLC & Air

Install the code-generation and live-reload tools globally:

go install github.com/a-h/templ/cmd/templ@latest && go install github.com/air-verse/air@latest && go install github.com/sqlc-dev/sqlc/cmd/sqlc@latest

Or, using Go 1.24's tool directives, track them in the module instead:

go get -tool github.com/a-h/templ/cmd/templ@latest && go get -tool github.com/air-verse/air@latest && go get -tool github.com/sqlc-dev/sqlc/cmd/sqlc@latest

Run the following command to generate the Templ files for the bot:

templ generate

Run the following command to generate the SQLC files for the bot:

sqlc generate -f ./src/sql/sqlc.yaml

Then run air in the WisBot repository to build the bot and live-reload it as you make changes:

air

Running the bot

Note

You will need a .env file with the following environment variables:

# WISBOT ENV VARIABLES
SERVER_IP=wisbot.yourdomain.com
SERVER_PORT=8080

# WISBOT CONFIG
MAX_FILES_PER_USER=3
DELETE_FILES_AFTER_DAYS=7
MAX_FILE_SIZE=250 # in MB
DATABASE_URL=postgres://username:password@localhost:5432/database_name

# OLLAMA ENV VARIABLES
OLLAMA_URL=http://10.5.0.3:11434
OLLAMA_MODEL=llama3.2
OLLAMA_KEEP_ALIVE=24h

# POSTGRES ENV VARIABLES
POSTGRES_USER=username
POSTGRES_PASSWORD=password
POSTGRES_DB=database_name
POSTGRES_PORT=5432

# DISCORD TOKEN
DISCORD_TOKEN_WISBOT=your_discord_token
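
As a rough sketch of how these variables might be read at startup (the helper below is hypothetical, not WisBot's actual configuration code):

package main

import (
	"fmt"
	"os"
	"strconv"
)

// getenv returns the value of key, or fallback if it is unset.
func getenv(key, fallback string) string {
	if v := os.Getenv(key); v != "" {
		return v
	}
	return fallback
}

func main() {
	serverIP := getenv("SERVER_IP", "localhost")
	serverPort := getenv("SERVER_PORT", "8080")
	maxFiles, _ := strconv.Atoi(getenv("MAX_FILES_PER_USER", "3"))
	maxFileSizeMB, _ := strconv.Atoi(getenv("MAX_FILE_SIZE", "250"))
	databaseURL := os.Getenv("DATABASE_URL")

	fmt.Printf("serving %s:%s, max %d files per user, %d MB each, db=%q\n",
		serverIP, serverPort, maxFiles, maxFileSizeMB, databaseURL)
}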

Running the bot using Go

You can run the bot using the following command:

go run ./src

Prepare Linux (Ubuntu 22.04) for running Wisbot

Remove any existing docker packages and conflicting dependencies:

for pkg in docker.io docker-doc docker-compose docker-compose-v2 podman-docker containerd runc; do sudo apt-get remove $pkg; done

Setup docker-apt repository:

# Add Docker's official GPG key:
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc

# Add the repository to Apt sources:
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update

Install docker:

sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin

Add the user to the docker group:

sudo usermod -aG docker $USER

Install the GitHub Runner on Ubuntu 22.04

https://github.com/uhstray-io/WisBot/settings/actions/runners/new?arch=x64&os=linux

mkdir actions-runner && cd actions-runner

Download the latest runner package:

curl -o actions-runner-linux-x64-2.320.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.320.0/actions-runner-linux-x64-2.320.0.tar.gz

Validate the Runner:

echo "93ac1b7ce743ee85b5d386f5c1787385ef07b3d7c728ff66ce0d3813d5f46900  actions-runner-linux-x64-2.320.0.tar.gz" | shasum -a 256 -c

Extract the runner:

tar xzf ./actions-runner-linux-x64-2.320.0.tar.gz

Configure the runner:

./config.sh --url <MY_URL> --token <MY_TOKEN>

Test the runner:

./run.sh

Set up the GitHub runner as a service:

https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/configuring-the-self-hosted-runner-application-as-a-service

sudo ./svc.sh install

Start the GitHub runner service:

sudo ./svc.sh start

Restart docker and enable the BuildKit to ensure compatibility with the deployment:

export DOCKER_BUILDKIT=1
sudo systemctl restart docker

Use sudo ./svc.sh stop and sudo ./svc.sh uninstall to stop or remove the service.

Set up Nvidia Drivers and Container Toolkit on Ubuntu 22.04

https://developer.nvidia.com/datacenter-driver-downloads?target_os=Linux&target_arch=x86_64&Distribution=Ubuntu&target_version=22.04&target_type=deb_network

Set up the Nvidia drivers GPG key:

wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt-get update

Install the NVIDIA open kernel module drivers:

sudo apt-get install -y nvidia-open-565

Add the container toolkit repository:

curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
  && curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
    sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
    sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

Configure Experimental Packages:

sudo sed -i -e '/experimental/ s/^#//g' /etc/apt/sources.list.d/nvidia-container-toolkit.list

Update the repository and install the Nvidia Container Toolkit:

sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit

Configure the toolkit to use Docker:

sudo nvidia-ctk runtime configure --runtime=docker

Restart Docker:

sudo systemctl restart docker

Using Docker to build and deploy Wisbot

Building the Docker Image

Build the latest WisBot image:

docker build -t wisbot .

Run the image as a detached container:

docker run -d wisbot

Running the bot using Docker Compose

docker compose up -d
docker compose down

Quick scripts for starting, stopping, and restarting the application:

# Start
./start.sh

# Stop
./stop.sh  

# Restart (down, rebuild, up)
./restart.sh

Running the container with GPU acceleration enabled:

Enable Nvidia Container Toolkit resources on Ubuntu 22.04 WSL:

curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg && curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

Update the package list and install the Nvidia Container Toolkit:

sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit

Configure Docker to use the nvidia-container-toolkit:

sudo nvidia-ctk runtime configure --runtime=docker && sudo systemctl restart docker

Run the image with GPU acceleration:

docker run -d --gpus all wisbot

https://docs.docker.com/desktop/gpu/

Ollama Docker Image Documentation

https://hub.docker.com/r/ollama/ollama

Environment Variables

WisBot uses the following environment variables for Ollama:

  • OLLAMA_URL: The URL where the Ollama service is running (e.g., http://10.5.0.3:11434)
  • OLLAMA_MODEL: The model to use (e.g., llama3.2)
  • OLLAMA_KEEP_ALIVE: How long to keep the model loaded (e.g., 24h)
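
As a minimal sketch of how these variables could drive a call to Ollama's REST endpoint (POST /api/generate with streaming disabled); this is a generic client example, not WisBot's actual LLM integration:

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Build the request body from the documented environment variables.
	body, _ := json.Marshal(map[string]any{
		"model":      os.Getenv("OLLAMA_MODEL"),      // e.g. llama3.2
		"prompt":     "Summarize what WisBot does.",
		"stream":     false,
		"keep_alive": os.Getenv("OLLAMA_KEEP_ALIVE"), // e.g. 24h
	})

	resp, err := http.Post(os.Getenv("OLLAMA_URL")+"/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("ollama request failed:", err)
		return
	}
	defer resp.Body.Close()

	// With stream disabled, Ollama returns a single JSON object with a "response" field.
	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		fmt.Println("decode response:", err)
		return
	}
	fmt.Println(out.Response)
}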

Web Interface

The compose file includes configuration for an optional Ollama web UI, which is currently commented out. Uncomment the ollama-webui service in the compose.yaml file if you want to use it.

Issues

CRLF vs LF

If Git reports line-ending (CRLF vs LF) issues, configure the repository to check out and commit LF endings:

git config core.eol lf
git config core.autocrlf input

About

Wisward Agent Bot integrating business-as-code platform data, services, and search.
