Overview: This repository gives you three simple ways to interact with Ollama models: a CLI, a local GUI, or a hosted web app.
Skip to: Option 1: CLI | Option 2: Local GUI with Node.js | Option 3: Remote GUI with Vercel
NOTE: This setup is intended for testing and personal use only. Exposing your local server via ngrok without additional security measures puts your data and privacy at considerable risk.
Option 1: CLI (▶️)
- Visit ollama.com and install the application.
- Confirm it’s running via Task Manager (Windows) or your system monitor, or run ollama --version in a terminal.
*Screenshot: Verifying installation*
ollama run <model_name> # Download (if needed) and start an interactive chat
*Screenshot: Running a model*
ollama pull <model_name> # Download a model without running it
ollama list # List installed models
To exit the chat:
/bye
Option 2: Local GUI with Node.js (▶️)
mkdir backend # Create a project folder
cd backend
npm init -y # Initialize a Node.js project
npm install express axios # Web server + HTTP client
Create a server.js file (see the repo for a full example).
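If you just want something to start from, here is a minimal sketch of what server.js could look like: an Express server that proxies prompts to Ollama's local API. The /api/chat route and the llama3 default model are illustrative assumptions, not taken from the repo; adapt them to the actual code.

```javascript
// server.js — minimal sketch, not the repo's exact implementation.
// Assumes Ollama is running locally on its default port (11434).
const express = require('express');
const axios = require('axios');

const app = express();
app.use(express.json());
app.use(express.static('public')); // serves public/index.html (used in the next step)

// Hypothetical route name; the repo's route may differ.
app.post('/api/chat', async (req, res) => {
  try {
    // Forward the prompt to Ollama's non-streaming generate endpoint.
    const { data } = await axios.post('http://localhost:11434/api/generate', {
      model: req.body.model || 'llama3', // illustrative default model
      prompt: req.body.prompt,
      stream: false,
    });
    res.json({ response: data.response });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(5000, () => console.log('API listening on http://localhost:5000'));
```

Once both servers are running (next step), POSTing JSON like {"prompt": "Hello"} to http://localhost:5000/api/chat should return the model's reply.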
*Screenshot: Setting up Node.js*
ollama serve # Start Ollama backend
node server.js # Start your Node.js API
*Screenshot: Node server running*
Create a simple public/index.html (see the sketch below) and open:
http://localhost:5000
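For reference, a bare-bones index.html could look like the following. It assumes the hypothetical /api/chat route from the server sketch above; the element IDs are purely illustrative.

```html
<!-- public/index.html — illustrative sketch, assuming the /api/chat route above. -->
<!DOCTYPE html>
<html>
<body>
  <input id="prompt" placeholder="Ask the model something" />
  <button id="send">Send</button>
  <pre id="output"></pre>
  <script>
    document.getElementById('send').addEventListener('click', async () => {
      const prompt = document.getElementById('prompt').value;
      // Call the Node.js API, which in turn calls Ollama.
      const res = await fetch('/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt }),
      });
      const data = await res.json();
      document.getElementById('output').textContent = data.response || data.error;
    });
  </script>
</body>
</html>
```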
Option 3: Remote GUI with Vercel (▶️)
Expose your local Node.js server with ngrok:
ngrok http 5000
Take note of the public forwarding URL that ngrok prints.
*Screenshot: Setting up ngrok*
Update your frontend code to use the ngrok URL (see repo).
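In practice this usually means changing the fetch target from a relative path (or localhost) to the tunnel's absolute URL. A hypothetical example, with a placeholder where your actual ngrok URL goes:

```javascript
// Illustrative only — substitute the URL ngrok printed for the placeholder below.
const API_BASE = 'https://YOUR-SUBDOMAIN.ngrok-free.app'; // was a relative path / localhost

async function ask(prompt) {
  const res = await fetch(`${API_BASE}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  return (await res.json()).response;
}
```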
- Push the frontend repo to GitHub.
- Deploy via Vercel.
Access your live app via the Vercel-provided URL.
*Screenshot: Testing remote GUI*
GIFs converted with https://www.freeconvert.com/convert/video-to-gif
May 23, 2025