local-llm-npc is an interactive educational game built for the Google Gemma 3n Impact Challenge. It leverages Gemma 3n’s on-device, multimodal AI to deliver private, offline-first learning experiences in a beautiful garden setting. The project demonstrates how next-generation AI can revolutionize education, accessibility, and sustainability—especially in low-connectivity regions.
- Impact: Real-time, personalized agricultural education with privacy-first AI.
- Gemma 3n Integration: Uses Gemma 3n (via Ollama) for on-device, structured educational conversations.
- Text-based: Supports rich, hands-on learning through interactive text dialogue.
- Offline-Ready: Runs locally—no internet required for core features.
- Educational NPC: Teaches sustainable farming, botany, and more using Socratic dialogue and hands-on activities.
- Progress Tracking: Tracks learning checkpoints, completed topics, and assessments.
- Customizable AI Host: Easily set your Ollama/Gemma 3n endpoint in settings.
- Gemma 3n Model: Supports dynamic model selection and mix’n’match capabilities (set at build time; not changeable by the player).
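The "structured educational conversations" above map onto Ollama's chat API (`POST /api/chat`). As a minimal sketch, a request payload for the NPC might be built like this; the helper name and the Socratic system prompt are illustrative, not taken from the project:

```python
import json

def build_npc_request(model: str, player_message: str) -> dict:
    """Hypothetical helper: build an Ollama /api/chat payload for the NPC."""
    return {
        "model": model,       # e.g. "gemma3n:e2b" or "gemma3n:e4b"
        "stream": False,      # ask for a single complete reply
        "messages": [
            {
                "role": "system",
                # Illustrative Socratic-style instruction, not the game's actual prompt.
                "content": (
                    "You are a garden NPC teaching sustainable farming. "
                    "Use Socratic questions and hands-on activities."
                ),
            },
            {"role": "user", "content": player_message},
        ],
    }

payload = build_npc_request("gemma3n:e2b", "Why is crop rotation useful?")
body = json.dumps(payload)  # POST this to http://<host>:11434/api/chat
```

Setting `stream` to `False` keeps the sketch simple; a real NPC would likely stream tokens so dialogue appears incrementally.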
- Godot Engine: Version 4.4.1 (Mono/C# build required). Download: Godot 4.4.1 Mono
- .NET SDK: Version 8.0 or higher. Download: .NET SDK
- Ollama: Installed locally or on a LAN host
Ollama Setup
Ensure that the Ollama host is installed on your local machine or available on your LAN.
Linux

1. Install Ollama on your host (Ollama Setup).

2. Edit the systemd service file:

   ```shell
   sudo nano /etc/systemd/system/ollama.service
   ```

3. Add the following environment variable in the `[Service]` section:

   ```
   Environment="OLLAMA_HOST=0.0.0.0"
   ```

   Note: The `OLLAMA_HOST=0.0.0.0` setting is optional if the Ollama server runs on localhost and does not need to be reachable from the LAN.

4. Save the file, then reload and restart the service:

   ```shell
   sudo systemctl daemon-reload
   sudo systemctl restart ollama.service
   ```
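After restarting, you can confirm the server answers before pointing the game at it. A small sketch using only the Python standard library; `/api/version` is Ollama's version endpoint, while the helper names are illustrative:

```python
import urllib.request

def ollama_base_url(host: str, port: int = 11434) -> str:
    """Build the base URL the game will talk to (illustrative helper)."""
    return f"http://{host}:{port}"

def is_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers on /api/version."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=timeout):
            return True
    except OSError:  # connection refused, DNS failure, timeout, ...
        return False

# From another machine on the LAN, use the host's IP instead of localhost:
# is_reachable(ollama_base_url("192.168.1.50"))
```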
Windows

1. Install Ollama on your host (Ollama Setup).

2. On the machine running Ollama, set the environment variable:

   ```
   OLLAMA_HOST=0.0.0.0
   ```

   You can do this via System Properties or using PowerShell.

   Note: The `OLLAMA_HOST=0.0.0.0` setting is optional if the Ollama server runs on localhost and does not need to be reachable from the LAN.

3. Restart the Ollama app.
After Ollama is installed and running, open a terminal and run:

```shell
ollama pull gemma3n:e4b   # the larger model
ollama pull gemma3n:e2b   # the smaller model
```
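Once pulled, the models appear in Ollama's model list (`GET /api/tags`). A sketch of checking for them programmatically; the helper names are illustrative, not part of the project:

```python
import json
import urllib.request

def list_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of models the Ollama server has pulled."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return [m["name"] for m in json.load(resp)["models"]]

def has_gemma3n(models: list[str]) -> bool:
    """True if any Gemma 3n variant (e2b or e4b) is present."""
    return any(name.startswith("gemma3n:") for name in models)

# Example (requires a running Ollama server):
# print(has_gemma3n(list_models()))
```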
For the Jetson Orin Nano and other resource-constrained devices, the larger Gemma 3n model (`gemma3n:e4b`) requires additional swap space.

Recommendation: For best performance, run Ollama from an SSD rather than an SD card.

NOTE: This setup has been tested on a Jetson Orin Nano Super Developer Kit with the OS installed on the SSD and the swap space increased as described below.
To add 8 GB of swap:

```shell
sudo fallocate -l 8G /swapfile   # create 8GB swap file
sudo chmod 600 /swapfile         # restrict permissions
sudo mkswap /swapfile            # mark file as swap
sudo swapon /swapfile            # enable swap immediately
swapon --show                    # verify swap is active
sudo nano /etc/fstab             # add '/swapfile swap swap defaults 0 0' to auto-enable at boot
```

To install the OS on the SSD:
Set your Ollama host URL in the game settings (e.g., `http://localhost:11434`, or the network IP of the machine where Ollama is installed).
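Whatever is typed into that settings field ultimately has to be a well-formed base URL. A sketch of the kind of normalization a settings screen might apply; this is illustrative, not the project's actual code:

```python
def normalize_host_url(raw: str) -> str:
    """Turn user input like 'localhost:11434' into a usable base URL."""
    url = raw.strip().rstrip("/")          # drop stray spaces and trailing slash
    if not url.startswith(("http://", "https://")):
        url = "http://" + url              # default to plain HTTP for LAN hosts
    return url

# normalize_host_url("192.168.1.50:11434")  ->  "http://192.168.1.50:11434"
```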
Option 1: Build from Source

1. Clone the Repository

   ```shell
   git clone https://github.com/code-forge-temple/local-llm-npc.git
   cd local-llm-npc
   ```

2. Restore .NET Dependencies

   ```shell
   dotnet restore
   ```

3. Open the Project in Godot

   - Launch Godot 4.4.1 (Mono/C#).
   - Open the project folder.

4. Build and Run

   - Press Play in the Godot editor.
   - Configure your Ollama host URL in the game settings.
Option 2: Run Prebuilt Executable

1. Clone the Repository

   ```shell
   git clone https://github.com/code-forge-temple/local-llm-npc.git
   cd local-llm-npc
   ```

2. Run the Executable

   - For Windows: open `BIN/WINDOWS/local-llm-npc (4.4).exe`
   - For Linux: open `BIN/LINUX/local-llm-npc (4.4).x86_64`
Watch the project presentation here:
See PROJECT_ARCHITECTURE.md for a detailed overview of the main components and their responsibilities.
Sprites used in this project are from Kenney's asset store, released under Creative Commons CC0.
Audio files are from the Yellowstone National Park Sound Library and are in the public domain:
"The files available here were recorded in the park and are in the public domain."
See LICENSE for details.