We want to create a user-friendly chat-style GUI for running LLM inference via our Java engine (powered by TornadoVM), inspired by the layout shown in the attached screenshot.
Desired Features

- Prompt Input: A text box where users can enter prompts for the LLM.
- Dropdown Selectors:
  - Engine: Choose between TornadoVM, JVM, etc.
  - Model: Select the model to use for inference (e.g., Llama, Mistral).
- File Picker: A "Browse" button to select .java files or model directories.
- Run Button: A button to trigger inference (e.g., using tornado-llama-opencv).
- Output Display: A read-only area to show model responses and logs (a rough JavaFX sketch covering these controls appears at the end of this issue).
- Optional: System Monitoring Panel:
  - Live GPU usage via nvtop or nvidia-smi.
  - Live CPU & memory stats via htop or Java system metrics.
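For the optional monitoring panel, a minimal polling sketch is shown below. It assumes a HotSpot-style JDK 14+ `com.sun.management.OperatingSystemMXBean` (older JDKs expose the same data as `getSystemCpuLoad()` / `getTotalPhysicalMemorySize()`) and `nvidia-smi` on the PATH. The class name, the one-second interval, and the label format are illustrative only, not part of the existing code base.

```java
import com.sun.management.OperatingSystemMXBean;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.lang.management.ManagementFactory;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

public class SystemMonitor {

    // The cast assumes a HotSpot/OpenJDK-style JDK that exposes com.sun.management.
    private final OperatingSystemMXBean os =
            (OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();

    /** Whole-machine CPU load as a percentage (0-100). */
    public double cpuLoadPercent() {
        return os.getCpuLoad() * 100.0;
    }

    /** Physical memory currently in use, in megabytes. */
    public long usedMemoryMb() {
        return (os.getTotalMemorySize() - os.getFreeMemorySize()) / (1024 * 1024);
    }

    /** GPU utilization reported by nvidia-smi, or -1 if the tool is unavailable. */
    public int gpuUtilizationPercent() {
        try {
            Process p = new ProcessBuilder("nvidia-smi",
                    "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits").start();
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line = r.readLine();
                return line != null ? Integer.parseInt(line.trim()) : -1;
            }
        } catch (Exception e) {
            return -1;
        }
    }

    /** Polls once per second and hands a formatted stats line to the GUI. */
    public ScheduledExecutorService startPolling(Consumer<String> sink) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> sink.accept(String.format(
                "CPU %.0f%%  |  RAM %d MB  |  GPU %d%%",
                cpuLoadPercent(), usedMemoryMb(), gpuUtilizationPercent())),
                0, 1, TimeUnit.SECONDS);
        return scheduler;
    }
}
```

The `Consumer<String>` sink keeps the monitor GUI-agnostic: a JavaFX panel could pass `text -> Platform.runLater(() -> statsLabel.setText(text))`, while an external front end could log the line instead.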
Preferably, this would be integrated into the core code base with JavaFX, or implemented externally with Qt or PyQt.
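To make the JavaFX option concrete, here is a rough, self-contained sketch of the layout described above: a prompt box, engine/model dropdowns, a Browse button backed by a `FileChooser`, a Run button that launches the inference CLI on a background thread, and a read-only output area that the process's stdout is streamed into. The class name `LlamaChatApp` is a placeholder, the command name is taken from the example in this issue, and its flags are assumptions that would need to match the project's actual launcher; this is a sketch of the wiring, not a finished implementation.

```java
import javafx.application.Application;
import javafx.application.Platform;
import javafx.geometry.Insets;
import javafx.scene.Scene;
import javafx.scene.control.*;
import javafx.scene.layout.*;
import javafx.stage.FileChooser;
import javafx.stage.Stage;

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class LlamaChatApp extends Application {

    @Override
    public void start(Stage stage) {
        // Prompt input
        TextArea promptInput = new TextArea();
        promptInput.setPromptText("Enter your prompt here...");
        promptInput.setPrefRowCount(3);

        // Dropdown selectors; their values would map to additional launcher flags
        ComboBox<String> engineBox = new ComboBox<>();
        engineBox.getItems().addAll("TornadoVM", "JVM");
        engineBox.getSelectionModel().selectFirst();

        ComboBox<String> modelBox = new ComboBox<>();
        modelBox.getItems().addAll("Llama", "Mistral");
        modelBox.getSelectionModel().selectFirst();

        // File picker for the model file or directory
        TextField modelPathField = new TextField();
        modelPathField.setPromptText("Path to model...");
        Button browseButton = new Button("Browse");
        browseButton.setOnAction(e -> {
            FileChooser chooser = new FileChooser();
            File file = chooser.showOpenDialog(stage);
            if (file != null) {
                modelPathField.setText(file.getAbsolutePath());
            }
        });

        // Read-only output area for model responses and logs
        TextArea output = new TextArea();
        output.setEditable(false);

        // Run button: launches the inference CLI as an external process and streams
        // its stdout into the output area. The "--model"/"--prompt" flags are
        // placeholders for whatever the real launcher expects.
        Button runButton = new Button("Run");
        runButton.setOnAction(e -> {
            String modelPath = modelPathField.getText();
            String prompt = promptInput.getText();
            new Thread(() -> {
                try {
                    Process process = new ProcessBuilder(
                            "tornado-llama-opencv", "--model", modelPath, "--prompt", prompt)
                            .redirectErrorStream(true)
                            .start();
                    try (BufferedReader reader = new BufferedReader(
                            new InputStreamReader(process.getInputStream()))) {
                        String line;
                        while ((line = reader.readLine()) != null) {
                            String text = line;
                            Platform.runLater(() -> output.appendText(text + "\n"));
                        }
                    }
                } catch (Exception ex) {
                    Platform.runLater(() -> output.appendText("Error: " + ex.getMessage() + "\n"));
                }
            }, "inference-runner").start();
        });

        HBox controls = new HBox(10, new Label("Engine:"), engineBox,
                new Label("Model:"), modelBox, modelPathField, browseButton, runButton);
        VBox root = new VBox(10, promptInput, controls, output);
        root.setPadding(new Insets(10));
        VBox.setVgrow(output, Priority.ALWAYS);

        stage.setScene(new Scene(root, 800, 600));
        stage.setTitle("LLM Inference GUI");
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```

Running inference through `ProcessBuilder` keeps the GUI decoupled from the engine, which also leaves the door open for an external Qt/PyQt front end that shells out to the same launcher; a tighter integration could instead call the Java inference API directly from the Run handler.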