
AI - a commandline LLM chat client in BASH with conversation/completion and image generation support

Features:

  • Supports any local or remote OpenAI-compatible API endpoint (GPT4o, Gemini, Grok, OpenRouter, Ollama, LM Studio and more)

  • Interactive chat sessions

  • Multiline input

  • Image generation support (currently limited to DALL·E 2):

    • image preview grid rendered directly in the terminal
    • generate up to 10 images at a time
    • automatic URL shortening
    • optionally save images and prompt to local disk
  • Markdown support for replies (requires glow)

  • Single turn Q&A with optional follow-up conversation

  • Data piping support (sending file contents to the LLM)

  • Full conversation support:

    • store unlimited conversations locally (in JSON format)
    • quickly resume the last conversation
    • delete/resume any stored conversation
    • replay conversation messages on resume
    • store the current conversation and start a new one (reset history) during an interactive session
    • automatic conversation topic identification and update
  • Multiple chat model support:

    • switch to a different model mid-conversation
    • freely combine local and remote models
    • a single tool to query any model served through an OpenAI-compatible API endpoint
  • Multiple system prompt support:

    • manage multiple system prompts to alter the base model behavior and persona

Full command line options (ai -h):

Manage models (add/delete/set default):

ai -m

Manage system prompts (add/delete/set default):

ai -s

Disable markdown rendering:

ai -n

Start a new interactive conversation [optionally overriding the default model/system prompt]:

ai [-m <model_name>|-s <sysprompt_name>]
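
For example (the model and system prompt names below are placeholders for entries you have previously added via ai -m and ai -s), either of:

ai -m gpt-4o
ai -s translator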

Single turn Q&A (asks whether you want to continue interacting, otherwise quits after the answer):

ai [-m <model_name>|-s <sysprompt_name>] "how many planets are there in the solar system?"

Generate one or more images (default 1, max 10):

ai -i [num] "a cute cat"

Submit data as part of the question:

cat file.txt | ai [-m <model_name>|-s <sysprompt_name>] "can you summarize the contents of this file?"
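
Since the piped data is read from stdin, the output of other commands should work the same way, for example (a hypothetical usage):

git diff | ai "can you review these changes?"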

List saved conversations:

ai -l

Continue last conversation:

ai -c

Continue specific conversation:

ai -c <conversation_id>
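
A typical workflow is to list the stored conversations first, then resume one by its ID (the ID below is a placeholder):

ai -l
ai -c 3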

Delete a specific conversation:

ai -d <conversation_id>

Delete a range of conversations:

ai -d <conversation_id_start>-<conversation_id_end>

Delete all conversations:

rm "$HOME/.config/ai-bash/conversations.json"

Usage examples:

  • Adding a model (screenshot)
  • Listing added models (screenshot)
  • Interaction and conversation resuming (asciicast)
  • Image generation (asciicast)
  • Input piping to stdin (asciicast)

Installation:

Prerequisites:
  • Install jq, curl, imagemagick, catimg, fzf

    • e.g. on Ubuntu: apt -y install jq curl imagemagick catimg fzf
  • Install glow for Markdown rendering support in your terminal
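
A quick sanity check that all prerequisites are on your $PATH (this assumes ImageMagick provides the convert binary, as the Ubuntu packages do):

for cmd in jq curl convert catimg fzf glow; do command -v "$cmd" >/dev/null || echo "missing: $cmd"; done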

Script download:

Install the script by either cloning this repository or downloading it directly to a directory in your $PATH, e.g.:

curl "https://raw.githubusercontent.com/nitefood/ai-bash/master/ai" > /usr/bin/ai && chmod 0755 /usr/bin/ai
