
Build GUI Chatbox for GPULlama3.java Inference with embedded GPU resource monitoring #24

@mikepapadim

Description:

We want to create a user-friendly chat-style GUI for running LLM inference via our Java engine (powered by TornadoVM), inspired by the layout shown in the attached screenshot.

Desired Features

  • Prompt Input

    • A text box where users can enter prompts for the LLM
  • Dropdown Selectors

    • Engine: Choose between TornadoVM, JVM, etc.
    • Model: Select the model to use for inference (e.g., Llama, Mistral)
  • File Picker

    • A "Browse" button to select .java files or model directories
  • Run Button

    • A button to trigger inference (e.g., using tornado-llama-opencv)
  • Output Display

    • A read-only area to show model responses and logs
  • System Monitoring Panel (optional)

    • Live GPU usage via nvtop or nvidia-smi (see the polling sketch after this list)
    • Live CPU & memory stats via htop or Java system metrics

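For the GPU panel, here is a minimal sketch of the polling idea, assuming nvidia-smi is available on the PATH; the GpuMonitor class, the one-second interval, and the console output are illustrative only and not part of the existing code base (in the actual GUI the parsed values would be pushed to a label or chart on the FX thread, e.g. via Platform.runLater):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class GpuMonitor {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Poll GPU utilization and memory once per second via nvidia-smi
        scheduler.scheduleAtFixedRate(() -> {
            try {
                Process p = new ProcessBuilder("nvidia-smi",
                        "--query-gpu=utilization.gpu,memory.used,memory.total",
                        "--format=csv,noheader,nounits").start();
                try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                    String line = r.readLine();   // e.g. "42, 2048, 24576"
                    if (line != null) {
                        String[] parts = line.split(",\\s*");
                        // In the GUI this would update a monitoring widget instead of stdout
                        System.out.printf("GPU %s%%  mem %s/%s MiB%n", parts[0], parts[1], parts[2]);
                    }
                }
            } catch (Exception ex) {
                System.err.println("nvidia-smi not available: " + ex.getMessage());
            }
        }, 0, 1, TimeUnit.SECONDS);
    }
}
```

CPU and memory figures could be gathered without an external process using the standard JDK APIs, e.g. ManagementFactory.getOperatingSystemMXBean() and ManagementFactory.getMemoryMXBean(), polled on the same scheduler.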
Preferably, the GUI should be integrated into the core code base using JavaFX; alternatively, it could be built as an external tool with Qt or PyQt.
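As a starting point, here is a rough, self-contained JavaFX sketch of the layout described above; ChatBoxApp, the runInference stub, and the hard-coded engine/model choices are hypothetical placeholders for the real GPULlama3.java inference entry points:

```java
import java.io.File;

import javafx.application.Application;
import javafx.concurrent.Task;
import javafx.geometry.Insets;
import javafx.scene.Scene;
import javafx.scene.control.*;
import javafx.scene.layout.*;
import javafx.stage.FileChooser;
import javafx.stage.Stage;

public class ChatBoxApp extends Application {

    @Override
    public void start(Stage stage) {
        // Dropdown selectors for execution engine and model family
        ComboBox<String> engineBox = new ComboBox<>();
        engineBox.getItems().addAll("TornadoVM", "JVM");
        engineBox.getSelectionModel().selectFirst();

        ComboBox<String> modelBox = new ComboBox<>();
        modelBox.getItems().addAll("Llama", "Mistral");
        modelBox.getSelectionModel().selectFirst();

        // File picker for the model file or directory
        TextField modelPath = new TextField();
        Button browse = new Button("Browse");
        browse.setOnAction(e -> {
            File file = new FileChooser().showOpenDialog(stage);
            if (file != null) modelPath.setText(file.getAbsolutePath());
        });

        // Prompt input and read-only output area
        TextField prompt = new TextField();
        prompt.setPromptText("Enter prompt...");
        TextArea output = new TextArea();
        output.setEditable(false);

        // Run button triggers inference on a background thread so the UI stays responsive
        Button run = new Button("Run");
        run.setOnAction(e -> {
            Task<String> inference = new Task<>() {
                @Override
                protected String call() {
                    // Placeholder: call the actual inference entry point here
                    return runInference(engineBox.getValue(), modelPath.getText(), prompt.getText());
                }
            };
            inference.setOnSucceeded(ev -> output.appendText(inference.getValue() + "\n"));
            new Thread(inference, "inference").start();
        });

        HBox controls = new HBox(8, new Label("Engine:"), engineBox,
                new Label("Model:"), modelBox, modelPath, browse, run);
        VBox root = new VBox(8, controls, prompt, output);
        root.setPadding(new Insets(10));
        VBox.setVgrow(output, Priority.ALWAYS);

        stage.setScene(new Scene(root, 800, 600));
        stage.setTitle("GPULlama3.java Chat");
        stage.show();
    }

    // Hypothetical stub; replace with the real GPULlama3.java inference call.
    private String runInference(String engine, String model, String prompt) {
        return "[" + engine + "] response for: " + prompt;
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```

Running inference inside a javafx.concurrent.Task keeps the UI responsive while a response is generated; streamed tokens could be appended incrementally from the task via updateMessage or Platform.runLater.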

Example POC:

[Screenshot: example POC chat layout]
