🤖 Browser LLM Chat

An offline AI assistant that runs entirely in your browser using WebLLM and WebGPU! ✨

This application allows you to chat with LLMs directly in your browser without sending data to external servers. All processing happens locally on your device.

⚙️ Requirements

  • 🌐 A modern browser with WebGPU support (a quick way to check is shown below):
    • Chrome 113+
    • Edge 113+
    • Firefox Nightly 118+
  • 💻 A device with sufficient GPU capability
  • 💾 Approximately 1-4 GB of free storage space (depending on the model you choose)
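
Not sure whether your browser qualifies? A quick feature check like the sketch below (plain JavaScript, not code from this repo) reports whether WebGPU is actually usable:

```js
// Quick WebGPU availability check (illustrative sketch, not part of index.html).
async function checkWebGPU() {
  if (!("gpu" in navigator)) {
    console.log("WebGPU is not available in this browser.");
    return false;
  }
  // requestAdapter() resolves to null when no suitable GPU is found.
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    console.log("WebGPU is exposed, but no suitable GPU adapter was found.");
    return false;
  }
  console.log("WebGPU is ready to use.");
  return true;
}

checkWebGPU();
```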

🚀 How to Use

  1. Open the index.html file in a supported browser
  2. Select a model from the dropdown menu
  3. Click "Load Model" and wait for the download to complete (a sketch of what this step does under the hood follows these instructions)
  4. Start chatting with the AI! 💬
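
Under the hood, the "Load Model" step boils down to a single WebLLM call. The sketch below shows roughly what that looks like with the @mlc-ai/web-llm package (in a single-file app the import would typically come from a CDN ESM build such as https://esm.run/@mlc-ai/web-llm); the model ID is an example from WebLLM's prebuilt list and may not match the exact IDs used in index.html:

```js
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Example ID from WebLLM's prebuilt model list; the app's dropdown may use different ones.
const modelId = "SmolLM2-360M-Instruct-q4f16_1-MLC";

// Downloads the weights (or reads them from the browser cache), compiles the
// WebGPU kernels, and reports progress along the way.
const engine = await CreateMLCEngine(modelId, {
  initProgressCallback: (report) => console.log(report.text),
});
```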

🔧 Technical Details

This app uses:

  • 🧠 The WebLLM library to run models in the browser (see the chat example below)
  • ⚡ WebGPU for hardware acceleration
  • 🗜️ Quantized models for efficient performance
  • 📝 Basic HTML, CSS, and JavaScript (no framework dependencies)

🤖 Models

  • SmolLM2 360M: 🐥 A very small (360M-parameter) model, great for basic tasks and the quickest to download
  • Llama 3.1 8B: 🦙 The largest of the three (8B parameters), with the strongest capabilities and the biggest download
  • Phi 3.5 Mini: 🦊 A mid-sized (~3.8B-parameter) model that balances response quality and size
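
In WebLLM, each dropdown option maps to an entry in the library's prebuilt model list. The IDs below are plausible matches for the three models above; treat them as assumptions and check webllm.prebuiltAppConfig.model_list for the exact strings:

```js
// Assumed WebLLM model IDs for the dropdown options; the quantization suffixes
// (q4f16_1 / q4f32_1) may differ from what index.html actually ships with.
const MODEL_IDS = {
  "SmolLM2 360M": "SmolLM2-360M-Instruct-q4f16_1-MLC",
  "Llama 3.1 8B": "Llama-3.1-8B-Instruct-q4f32_1-MLC",
  "Phi 3.5 Mini": "Phi-3.5-mini-instruct-q4f16_1-MLC",
};
```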

πŸ‘¨β€πŸ’» Development

Feel free to modify the app to suit your needs. The entire application is contained in a single HTML file for simplicity.

📜 License

This project is open source and available under the MIT License.

✨ Enhanced Version

For a more feature-rich implementation, check out:

  • WebLLM Offline AI Assistant - A more advanced version with:
    • 🖥️ PC-themed desktop interface
    • 💬 Chat history support
    • 🗃️ IndexedDB caching
    • 📝 Logger
    • 🖱️ Draggable windows
    • 🔽 Taskbar and window controls
    • 📱 Responsive design for mobile and desktop

✨ Live demo: chat.ebenezerdon.com
