BeautifyOllama

BeautifyOllama Logo

A modern, beautiful web interface for Ollama AI models

Transform your local AI interactions with an elegant, feature-rich chat interface

GitHub Stars GitHub Forks GitHub Issues License

Next.js React TypeScript TailwindCSS

Demo • Features • Installation • Usage • Contributing • Roadmap


📖 About

BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.

⚠️ Early Development Notice
This project is in active development. Features and APIs may change. We welcome contributions and feedback from the community.

🎥 Demo

Demo video: beautifyollamavideocompressed.mp4

✨ Features

Current Features

  • 🎬 Animated Shine Borders - Eye-catching animated message borders with color cycling
  • 📱 Responsive Design - Mobile-first approach with seamless cross-device compatibility
  • 🌙 Theme System - Dark/light mode with system preference detection
  • ⚡ Real-time Streaming - Live response streaming from Ollama models (see the sketch after this list)
  • 🎯 Clean Interface - Simplified message rendering focused on readability
  • 🔄 Model Management - Easy switching between available Ollama models
  • ⌨️ Smart Input - Keyboard shortcuts (Enter to send, Shift+Enter for newlines)
  • 🎨 Modern UI/UX - Glassmorphism effects and smooth micro-animations
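The streaming feature talks to Ollama's HTTP API, which returns newline-delimited JSON chunks. The TypeScript sketch below shows one way to consume such a stream; it follows Ollama's public /api/chat contract but is illustrative, not the actual component code in this repository.

async function streamChat(model: string, prompt: string): Promise<string> {
  // POST to the local Ollama server; stream: true yields newline-delimited JSON.
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let full = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      // Each line looks like {"message":{"content":"..."},"done":false}
      const chunk = JSON.parse(line);
      full += chunk.message?.content ?? "";
      if (chunk.done) return full;
    }
  }
  return full;
}

In the UI, each content fragment would be appended to the message being rendered as it arrives rather than accumulated into a single string.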

🚧 Upcoming Features

  • 🔐 API Key Management - Secure storage and management of API credentials
  • 💾 Conversation History - Persistent chat history with search functionality
  • 🔧 Advanced Settings - Customizable model parameters and system prompts
  • 📁 File Upload Support - Document and image processing capabilities
  • 🌐 Multi-language Support - Internationalization for global users
  • 📊 Usage Analytics - Token usage tracking and conversation insights
  • 🔌 Plugin System - Extensible architecture for third-party integrations
  • ☁️ Cloud Sync - Optional cloud backup for conversations and settings

🚀 Installation

Prerequisites

Ensure you have the following installed on your system:

  • Node.js (v18 or higher)
  • npm, yarn, or pnpm
  • Ollama (for local AI model serving)

Step 1: Install Ollama

# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.ai/install.sh | sh

# Windows
# Download from https://ollama.ai/download

Step 2: Setup Ollama Models

# Start Ollama service
ollama serve

# Pull recommended models
ollama pull llama2
ollama pull codellama
ollama pull mistral

# Verify installation
ollama list

Step 3: Install BeautifyOllama

# Clone the repository
git clone https://github.com/falkon2/BeautifyOllama.git
cd BeautifyOllama

# Install dependencies
npm install
# or
yarn install
# or  
pnpm install

# Start development server
npm run dev
# or
yarn dev
# or
pnpm dev

Step 4: Access the Application

Open your browser and navigate to http://localhost:3000

🔧 Configuration

Environment Variables

Create a .env.local file in the project root:

# Ollama Configuration
NEXT_PUBLIC_OLLAMA_API_URL=http://localhost:11434
NEXT_PUBLIC_DEFAULT_MODEL=llama2

# Feature Flags (Coming Soon)
NEXT_PUBLIC_ENABLE_ANALYTICS=false
NEXT_PUBLIC_ENABLE_CLOUD_SYNC=false

# API Keys (Future Feature)
# OPENAI_API_KEY=your_openai_key_here
# ANTHROPIC_API_KEY=your_anthropic_key_here

Advanced Configuration

For custom Ollama installations or advanced setups, modify the configuration in src/config/ollama.ts:

export const ollamaConfig = {
  apiUrl: process.env.NEXT_PUBLIC_OLLAMA_API_URL || 'http://localhost:11434',
  defaultModel: process.env.NEXT_PUBLIC_DEFAULT_MODEL || 'llama2',
  timeout: 30000,
  maxRetries: 3
}
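As an illustration of how these values might be consumed, a small helper could apply the configured base URL and timeout when listing models. The helper below is hypothetical (it is not part of the current codebase) and assumes the usual @/ import alias for src/.

import { ollamaConfig } from "@/config/ollama";

// Hypothetical helper: list available model names using the configured URL and timeout.
export async function listModels(): Promise<string[]> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ollamaConfig.timeout);
  try {
    const res = await fetch(`${ollamaConfig.apiUrl}/api/tags`, { signal: controller.signal });
    if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
    // /api/tags responds with { models: [{ name: "llama2:latest", ... }, ...] }
    const data = await res.json();
    return data.models.map((m: { name: string }) => m.name);
  } finally {
    clearTimeout(timer);
  }
}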

📚 Usage

Basic Chat Interface

  1. Start a Conversation: Type your message in the input field
  2. Send Messages: Press Enter or click the send button
  3. New Lines: Use Shift + Enter for multi-line messages (see the sketch after this list)
  4. Switch Models: Use the model selector in the sidebar
  5. Theme Toggle: Click the theme button to switch between light/dark modes
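Steps 2 and 3 come down to a small key handler on the input. A minimal TypeScript sketch of that behavior (not the exact component code used here):

import type { KeyboardEvent } from "react";

// Enter sends the message; Shift+Enter falls through and inserts a newline.
function handleKeyDown(
  e: KeyboardEvent<HTMLTextAreaElement>,
  sendMessage: () => void
) {
  if (e.key === "Enter" && !e.shiftKey) {
    e.preventDefault(); // suppress the newline and send instead
    sendMessage();
  }
}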

Mobile Usage

  • Access Sidebar: Tap the menu button on mobile devices
  • Touch Gestures: Swipe gestures for navigation
  • Responsive Layout: Optimized for all screen sizes

πŸ—οΈ Architecture

Technology Stack

Layer            | Technology            | Purpose
-----------------|-----------------------|----------------------------------------
Frontend         | Next.js 15 + React 19 | Modern React framework with App Router
Styling          | TailwindCSS 4         | Utility-first CSS framework
Animation        | Framer Motion         | Smooth animations and transitions
Language         | TypeScript            | Type safety and developer experience
State Management | React Hooks           | Local state management
Theme            | next-themes           | Dark/light mode functionality
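The theme row maps onto next-themes' useTheme hook. A minimal sketch of the kind of toggle it enables (the component name here is illustrative, not the one shipped in this repo):

"use client";

import { useTheme } from "next-themes";

// Illustrative toggle button built on the next-themes hook.
export function ThemeToggle() {
  const { resolvedTheme, setTheme } = useTheme();
  const next = resolvedTheme === "dark" ? "light" : "dark";
  return <button onClick={() => setTheme(next)}>Switch to {next} mode</button>;
}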

Project Structure

beautifyollama/
├── src/
│   ├── app/                   # Next.js App Router
│   │   ├── globals.css        # Global styles
│   │   ├── layout.tsx         # Root layout
│   │   └── page.tsx           # Home page
│   ├── components/            # React components
│   │   ├── Chat.tsx           # Main chat interface
│   │   ├── ShineBorder.tsx    # Animated border component
│   │   ├── MarkdownRenderer.tsx
│   │   └── ui/                # Reusable UI components
│   ├── config/                # Configuration files
│   ├── hooks/                 # Custom React hooks
│   ├── lib/                   # Utility functions
│   └── types/                 # TypeScript type definitions
├── public/                    # Static assets
├── docs/                      # Documentation
└── tests/                     # Test files

🤝 Contributing

We welcome contributions from the community! BeautifyOllama is an early-stage project with lots of opportunities to make an impact.

Ways to Contribute

  1. πŸ› Bug Reports - Help us identify and fix issues
  2. πŸ’‘ Feature Requests - Suggest new functionality
  3. πŸ“ Code Contributions - Submit pull requests
  4. πŸ“š Documentation - Improve README, guides, and code comments
  5. 🎨 Design - UI/UX improvements and suggestions
  6. πŸ§ͺ Testing - Help test new features and edge cases

Development Setup

  1. Fork the repository on GitHub
  2. Clone your fork locally:
    git clone https://github.com/your-username/BeautifyOllama.git
    cd BeautifyOllama
  3. Create a branch for your feature:
    git checkout -b feature/your-feature-name
  4. Install dependencies:
    npm install
  5. Start development server:
    npm run dev

Contribution Guidelines

  • Code Style: Follow the existing code style and use TypeScript
  • Commits: Use conventional commit messages (feat:, fix:, docs:, etc.)
  • Testing: Add tests for new features when applicable
  • Documentation: Update README and inline comments for new features
  • Pull Requests: Provide clear descriptions and link related issues

Development Scripts

npm run dev          # Start development server
npm run build        # Build for production
npm run start        # Start production server
npm run lint         # Run ESLint
npm run type-check   # Run TypeScript compiler
npm test             # Run tests (when available)

🛣️ Roadmap

Phase 1: Core Features (Current)

  • Basic chat interface
  • Ollama integration
  • Theme system
  • Responsive design
  • Enhanced error handling
  • Performance optimizations

Phase 2: Advanced Features (Next)

  • API key management system
  • Conversation history persistence
  • File upload and processing
  • Advanced model settings
  • Export/import conversations

Phase 3: Enterprise Features (Future)

  • Multi-user support
  • Cloud synchronization
  • Plugin architecture
  • Usage analytics
  • Advanced security features

Phase 4: Ecosystem (Long-term)

  • Mobile applications
  • Desktop applications
  • API for third-party integrations
  • Marketplace for extensions

📊 Project Status

Feature        | Status         | Priority
---------------|----------------|---------
Core Chat      | ✅ Complete    | High
Theme System   | ✅ Complete    | High
Mobile Support | ✅ Complete    | High
API Keys       | 🚧 In Progress | High
File Upload    | 📋 Planned     | Medium
Cloud Sync     | 📋 Planned     | Low

πŸ› Troubleshooting

Common Issues

Ollama Connection Failed

# Check if Ollama is running
ollama serve

# Verify models are available
ollama list

# Test API endpoint
curl http://localhost:11434/api/tags
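The same check can be run from application code. A minimal TypeScript sketch that mirrors the curl call above:

// Sketch: verify the Ollama server is reachable before opening a chat session.
async function ollamaIsReachable(baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    return res.ok;
  } catch {
    return false; // server not running, wrong port, or blocked by the network
  }
}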

Build Errors

# Clear Next.js cache
rm -rf .next

# Reinstall dependencies
rm -rf node_modules package-lock.json
npm install

Hydration Errors

  • Clear browser cache and localStorage
  • Restart development server
  • Check for theme provider issues
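If the mismatch is theme-related, the usual next-themes arrangement is to render the provider in the root layout and put suppressHydrationWarning on the html element. A sketch of that setup, assuming the layout lives at src/app/layout.tsx as shown in the project structure:

// src/app/layout.tsx (sketch) - typical next-themes setup for the App Router.
import type { ReactNode } from "react";
import { ThemeProvider } from "next-themes";

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    // suppressHydrationWarning avoids warnings while next-themes updates the class on <html>.
    <html lang="en" suppressHydrationWarning>
      <body>
        <ThemeProvider attribute="class" defaultTheme="system" enableSystem>
          {children}
        </ThemeProvider>
      </body>
    </html>
  );
}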

Getting Help

If you're still stuck, open an issue or start a discussion on the GitHub repository (links at the bottom of this README).

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Ollama Team - For the excellent local AI runtime
  • Next.js Team - For the amazing React framework
  • Vercel - For seamless deployment platform
  • TailwindCSS - For the utility-first CSS framework
  • Framer Motion - For beautiful animations
  • All Contributors - For making this project better

⭐ Star History

Star History Chart


Made with ❤️ by the BeautifyOllama team

⭐ Star us on GitHub • 🐛 Report Bug • 💬 Join Discussion
