HoneyTTY

LLM-as-terminal honeypot — Browser (xterm.js) ⇄ FastAPI ⇄ Ollama ⇄ LLM

A tiny “LLM-as-terminal” emulator: a web terminal (xterm.js) talks to a FastAPI WebSocket backend, which in turn prompts an Ollama-hosted model to act like a Linux shell. The LLM generates realistic terminal output and maintains session state via a hidden state line. The backend stays dumb: it only builds prompts, parses replies, relays output to the browser, and stores history.
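A minimal sketch of that relay loop, assuming a /ws endpoint and a plain prompt format (both illustrative, not the repo's actual code); it uses a non-streaming Ollama call for brevity, whereas the real backend streams:

import os

import httpx
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

OLLAMA_API = os.environ.get("OLLAMA_API", "http://localhost:11434/api/generate")
LLM_MODEL = os.environ.get("LLM_MODEL", "gemma3:12b")

app = FastAPI()

@app.websocket("/ws")                        # endpoint path is an assumption
async def terminal_ws(ws: WebSocket):
    await ws.accept()
    history: list[str] = []                  # the backend only stores history
    try:
        while True:
            cmd = await ws.receive_text()    # one command per message from xterm.js
            prompt = "\n".join(history + [f"$ {cmd}"])
            async with httpx.AsyncClient(timeout=None) as client:
                resp = await client.post(OLLAMA_API, json={
                    "model": LLM_MODEL, "prompt": prompt, "stream": False})
            output = resp.json()["response"]  # the LLM invents the terminal output
            history.append(f"$ {cmd}\n{output}")
            await ws.send_text(output)
    except WebSocketDisconnect:
        pass                                 # client closed the terminal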


Why?

  • Build a convincing terminal without running real commands.
  • Keep the LLM in charge of behaviour and state; the backend doesn’t invent outputs.
  • Use as a honeypot or demo environment to observe command patterns safely.

Features

  • Minimal xterm.js UI.
  • FastAPI WebSocket endpoint.
  • Ollama streaming client.
  • History-first state: the LLM invents plausible listings/content when unknown, then stays consistent (see the sketch after this list).
  • Simple, readable code with lots of comments.
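The hidden state line behind that consistency works roughly like this: the model appends a machine-readable marker to each reply, which the backend strips before display and feeds back on the next turn. The “@@STATE:” marker below is an assumption for illustration; the repo's actual format may differ.

def split_reply(raw: str) -> tuple[str, str]:
    """Separate the visible terminal output from the hidden state line."""
    visible_lines, state = [], ""
    for line in raw.splitlines():
        if line.startswith("@@STATE:"):      # hypothetical marker, not the repo's
            state = line[len("@@STATE:"):].strip()
        else:
            visible_lines.append(line)
    return "\n".join(visible_lines), state

out, state = split_reply("total 0\n@@STATE: cwd=/home/guest")
# out == "total 0", state == "cwd=/home/guest"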

Requirements

  • Python 3.10+
  • pip for dependencies
  • Ollama running somewhere you can reach (local or remote); a quick reachability check follows this list
    • default endpoint: http://localhost:11434/api/generate
    • default model: gemma3:12b (configurable)
  • A modern browser (xterm.js is client-side only)
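To confirm the Ollama requirement is met before wiring up the terminal, a one-off request against the default endpoint and model works (assumes httpx is installed):

import httpx

resp = httpx.post(
    "http://localhost:11434/api/generate",
    json={"model": "gemma3:12b", "prompt": "Reply with: ok", "stream": False},
    timeout=60.0,
)
print(resp.json()["response"])   # any text back means the endpoint is reachable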

Quick start (Docker)

If you’ve added the Docker files:

docker compose up --build
# backend exposed on :8000
# frontend static site on whatever port you configured (e.g. :5000)

If you run Ollama in another container/host, set OLLAMA_API in the backend service environment accordingly.
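Once the stack is up, a quick smoke test over the WebSocket looks like this (the /ws path is an assumption; adjust to the backend's actual route, and pip install websockets first):

import asyncio

import websockets

async def main():
    # Path /ws is assumed; check the backend's routes for the real one.
    async with websockets.connect("ws://localhost:8000/ws") as ws:
        await ws.send("ls -la")
        print(await ws.recv())   # the LLM-generated directory listing

asyncio.run(main())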

Configuration

Environment variables (backend):

  • OLLAMA_API — e.g. http://localhost:11434/api/generate
  • LLM_MODEL — e.g. gemma3:12b, llama3

Change the prompt rules in backend/terminal.py (SYSTEM_PROMPT) to tweak behaviour. The initial seed and canonical listings live in Terminal._initial_seed() (also in terminal.py).
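For a sense of what tweaking means here, this is the general shape of such rules (the repo's actual SYSTEM_PROMPT text differs; the “@@STATE:” line matches the hypothetical marker sketched earlier):

SYSTEM_PROMPT = (
    "You are a Linux terminal. Reply ONLY with the raw output of the command, "
    "no explanations and no markdown. If a file or directory is unknown, "
    "invent plausible content once, then stay consistent with it. End every "
    "reply with a hidden line starting with '@@STATE:' recording cwd and any "
    "paths you have invented."
)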

Security notes

  • This is not a real shell. It does not execute system commands.
  • Do not rely on it for security boundaries; treat it as an illusion for research/demos.
  • Avoid exposing it on the open internet without proper isolation, rate limits, and logging (a minimal logging sketch follows).
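One minimal approach to the logging point: record every received command with a session id before it reaches the model (a sketch, not code from the repo):

import logging
import uuid

logging.basicConfig(
    filename="honeytty.log",
    format="%(asctime)s %(message)s",
    level=logging.INFO,
)
log = logging.getLogger("honeytty")

session_id = uuid.uuid4().hex[:8]        # one id per WebSocket connection

def record(cmd: str) -> None:
    """Append the attacker-supplied command to the audit log."""
    log.info("session=%s cmd=%r", session_id, cmd)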

License

MIT License © 2025 HoneyTTY

Acknowledgements