
3 Ways to Ollama: A Beginner's Guide to the Galaxy

Overview: This repository gives you simple options to interact with Ollama models using the CLI, a local GUI, or a hosted web app.

NOTE: This setup is intended for testing and personal use only. Exposing your local server via ngrok without additional security measures puts your data and privacy at considerable risk.


Walkthrough Highlights



Walkthrough: Handling Raw Bytes Stream from Ollama API Endpoint
▶️ Watch on YouTube
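
For reference, here is a minimal sketch of what that walkthrough covers: consuming the newline-delimited JSON byte stream that Ollama's /api/generate endpoint returns when streaming is enabled. The model name ("llama3") and the use of axios are assumptions; adapt them to whichever model you pulled.

// Minimal sketch: consume the raw byte stream from Ollama's /api/generate.
// Assumes Ollama is running on its default port (11434) and a model such as
// "llama3" has already been pulled; axios is an assumption here.
const axios = require('axios');

async function streamCompletion(prompt) {
  const response = await axios.post(
    'http://localhost:11434/api/generate',
    { model: 'llama3', prompt, stream: true },
    { responseType: 'stream' }
  );

  let buffered = '';
  response.data.on('data', (chunk) => {
    // Each chunk is raw bytes; complete lines are JSON objects.
    buffered += chunk.toString('utf8');
    const lines = buffered.split('\n');
    buffered = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const parsed = JSON.parse(line);
      process.stdout.write(parsed.response || '');
      if (parsed.done) process.stdout.write('\n');
    }
  });
}

streamCompletion('Why is the sky blue?');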




Walkthrough: Exposing Your Local API for Remote Access w/ ngrok
▶️ Watch on YouTube




Ollama Setup Guide

Option 1: CLI (▶️)

1. Download and Install

  • Visit ollama.com and install the application.
  • Confirm it’s running via Task Manager or system monitor.

*Screenshot: Verifying installation*

2. Run a Model

ollama run <model_name>

*Screenshot: Running a model*

3. Manage Models

ollama pull <model_name>   # Download
ollama list                # List installed

To exit:

/bye

Option 2: Local GUI with Node.js (▶️)

1. Set Up Backend

mkdir backend
cd backend
npm init -y
npm install express axios

Create a server.js file (see repo for example).
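
If you want a starting point before opening the repo, a minimal server.js might look like the sketch below. The /api/chat route name is illustrative rather than the repo's exact route; it simply proxies requests to Ollama on its default port.

// Minimal server.js sketch (see the repo for the full example).
// The /api/chat route name is illustrative; Ollama's default port is 11434.
const express = require('express');
const axios = require('axios');

const app = express();
app.use(express.json());
app.use(express.static('public')); // serves the frontend added in step 3

app.post('/api/chat', async (req, res) => {
  try {
    const { data } = await axios.post('http://localhost:11434/api/generate', {
      model: req.body.model || 'llama3',
      prompt: req.body.prompt,
      stream: false, // one JSON response instead of a byte stream
    });
    res.json({ reply: data.response });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(5000, () => console.log('Listening on http://localhost:5000'));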

*Screenshot: Setting up Node.js*

2. Start Servers

ollama serve       # Start Ollama backend
node server.js     # Start your Node.js API

*Screenshot: Node server running*

3. Add Frontend

Create a simple public/index.html and open:

http://localhost:5000
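
The page only needs a small script that posts the prompt to your local API. The snippet below (placed in a script tag inside public/index.html) assumes the /api/chat route from the server sketch above and an element with id "output"; both names are placeholders.

// Browser-side sketch for public/index.html (inside a <script> tag).
// Assumes the /api/chat route from the server sketch and an #output element.
async function askModel(prompt) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  const { reply } = await res.json();
  document.getElementById('output').textContent = reply;
}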

Option 3: Remote GUI with Vercel (▶️)

1. Expose Local Server

ngrok http 5000

Take note of the ngrok URL.

*Screenshot: Setting up ngrok*

2. Update Frontend

Update your frontend code to use the ngrok URL (see repo).
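
In practice this usually means pointing the frontend's fetch calls at the tunnel instead of a relative path. The hostname below is a placeholder for whatever URL ngrok printed in step 1.

// Illustrative only: replace the relative path with your ngrok URL.
const API_BASE = 'https://your-subdomain.ngrok-free.app'; // placeholder

async function askModel(prompt) {
  const res = await fetch(`${API_BASE}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  const { reply } = await res.json();
  return reply;
}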

3. Deploy on Vercel

  • Push the frontend repo to GitHub.
  • Deploy via Vercel.

4. Test Remote GUI

Access your live app via the Vercel-provided URL.

*Screenshot: Testing remote GUI*






GIFs converted with https://www.freeconvert.com/convert/video-to-gif
