
A modern, beautiful web interface for Ollama AI models
Transform your local AI interactions with an elegant, feature-rich chat interface
Demo • Features • Installation • Usage • Contributing • Roadmap
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
⚠️ Early Development Notice
This project is in active development. Features and APIs may change. We welcome contributions and feedback from the community.
Video Demo
beautifyollamavideocompressed.mp4
- 🎬 Animated Shine Borders - Eye-catching animated message borders with color cycling
- 📱 Responsive Design - Mobile-first approach with seamless cross-device compatibility
- 🌓 Theme System - Dark/light mode with system preference detection
- ⚡ Real-time Streaming - Live response streaming from Ollama models
- 🎯 Clean Interface - Simplified message rendering focused on readability
- 🔄 Model Management - Easy switching between available Ollama models
- ⌨️ Smart Input - Keyboard shortcuts (Enter to send, Shift+Enter for newlines)
- 🎨 Modern UI/UX - Glassmorphism effects and smooth micro-animations
- 🔐 API Key Management - Secure storage and management of API credentials
- 💾 Conversation History - Persistent chat history with search functionality
- 🔧 Advanced Settings - Customizable model parameters and system prompts
- 📁 File Upload Support - Document and image processing capabilities
- 🌐 Multi-language Support - Internationalization for global users
- 📊 Usage Analytics - Token usage tracking and conversation insights
- 🔌 Plugin System - Extensible architecture for third-party integrations
- ☁️ Cloud Sync - Optional cloud backup for conversations and settings
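The real-time streaming feature builds on Ollama's `/api/generate` endpoint, which streams newline-delimited JSON where each line carries a `response` token and a `done` flag. A minimal TypeScript sketch of consuming that stream (function names here are illustrative, not BeautifyOllama's actual internals):

```typescript
// Each NDJSON line from /api/generate looks like:
// {"model":"llama2","response":"Hel","done":false}
interface GenerateChunk {
  response: string;
  done: boolean;
}

// Pure helper: fold a block of NDJSON lines into the full response text.
export function collectStreamedText(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => (JSON.parse(line) as GenerateChunk).response)
    .join("");
}

// Network half (illustrative): read the body chunk by chunk and emit tokens.
export async function streamGenerate(
  prompt: string,
  onToken: (token: string) => void,
  model = "llama2",
  apiUrl = "http://localhost:11434"
): Promise<void> {
  const res = await fetch(`${apiUrl}/api/generate`, {
    method: "POST",
    body: JSON.stringify({ model, prompt, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    for (const line of decoder.decode(value).split("\n")) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line) as GenerateChunk;
      onToken(chunk.response);
      if (chunk.done) return;
    }
  }
}
```

The pure/impure split keeps the parsing logic unit-testable without a running Ollama instance.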
Ensure you have the following installed on your system:
- Node.js (v18 or higher)
- npm, yarn, or pnpm
- Ollama (for local AI model serving)
```bash
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.ai/install.sh | sh

# Windows
# Download from https://ollama.ai/download
```

```bash
# Start the Ollama service
ollama serve

# Pull recommended models
ollama pull llama2
ollama pull codellama
ollama pull mistral

# Verify the installation
ollama list
```
```bash
# Clone the repository
git clone https://github.com/falkon2/BeautifyOllama.git
cd BeautifyOllama

# Install dependencies
npm install
# or
yarn install
# or
pnpm install

# Start the development server
npm run dev
# or
yarn dev
# or
pnpm dev
```
Open your browser and navigate to http://localhost:3000
Create a `.env.local` file in the project root:
```bash
# Ollama Configuration
NEXT_PUBLIC_OLLAMA_API_URL=http://localhost:11434
NEXT_PUBLIC_DEFAULT_MODEL=llama2

# Feature Flags (Coming Soon)
NEXT_PUBLIC_ENABLE_ANALYTICS=false
NEXT_PUBLIC_ENABLE_CLOUD_SYNC=false

# API Keys (Future Feature)
# OPENAI_API_KEY=your_openai_key_here
# ANTHROPIC_API_KEY=your_anthropic_key_here
```
For custom Ollama installations or advanced setups, modify the configuration in `src/config/ollama.ts`:
```typescript
export const ollamaConfig = {
  apiUrl: process.env.NEXT_PUBLIC_OLLAMA_API_URL || 'http://localhost:11434',
  defaultModel: process.env.NEXT_PUBLIC_DEFAULT_MODEL || 'llama2',
  timeout: 30000,
  maxRetries: 3
};
```
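The `timeout` and `maxRetries` values are plain numbers, so the calling code has to enforce them. A hedged sketch of how a request could honor both settings (`retryWithTimeout` and its shape are illustrative, not the project's actual helper):

```typescript
// Retry a task up to maxRetries extra times, racing each attempt
// against a timeout, mirroring the ollamaConfig settings above.
async function retryWithTimeout<T>(
  task: () => Promise<T>,
  maxRetries: number,
  timeoutMs: number
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await Promise.race([
        task(),
        new Promise<never>((_, reject) =>
          setTimeout(() => reject(new Error("timeout")), timeoutMs)
        ),
      ]);
    } catch (err) {
      lastError = err; // remember the failure and try again
    }
  }
  throw lastError;
}

// Example usage against Ollama's model-list endpoint:
// const models = await retryWithTimeout(
//   () => fetch(`${ollamaConfig.apiUrl}/api/tags`).then((r) => r.json()),
//   ollamaConfig.maxRetries,
//   ollamaConfig.timeout
// );
```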
- Start a Conversation: Type your message in the input field
- Send Messages: Press `Enter` or click the send button
- New Lines: Use `Shift + Enter` for multi-line messages
- Switch Models: Use the model selector in the sidebar
- Theme Toggle: Click the theme button to switch between light and dark modes
- Access Sidebar: Tap the menu button on mobile devices
- Touch Gestures: Swipe gestures for navigation
- Responsive Layout: Optimized for all screen sizes
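The Enter / Shift+Enter behaviour above reduces to a small, framework-agnostic decision function. A sketch (names are illustrative, not the component's actual API):

```typescript
type InputAction = "send" | "newline" | "none";

// Decide what a keypress in the chat input should do.
export function keyAction(key: string, shiftKey: boolean): InputAction {
  if (key !== "Enter") return "none"; // other keys type normally
  return shiftKey ? "newline" : "send"; // Shift+Enter inserts a line break
}

// In a React component this would back the textarea's onKeyDown:
// if (keyAction(e.key, e.shiftKey) === "send") { e.preventDefault(); send(); }
```

Keeping the decision pure makes it trivial to unit test without rendering the component.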
| Layer | Technology | Purpose |
|---|---|---|
| Frontend | Next.js 15 + React 19 | Modern React framework with App Router |
| Styling | TailwindCSS 4 | Utility-first CSS framework |
| Animation | Framer Motion | Smooth animations and transitions |
| Language | TypeScript | Type safety and developer experience |
| State Management | React Hooks | Local state management |
| Theme | next-themes | Dark/light mode functionality |
```
beautifyollama/
├── src/
│   ├── app/                     # Next.js App Router
│   │   ├── globals.css          # Global styles
│   │   ├── layout.tsx           # Root layout
│   │   └── page.tsx             # Home page
│   ├── components/              # React components
│   │   ├── Chat.tsx             # Main chat interface
│   │   ├── ShineBorder.tsx      # Animated border component
│   │   ├── MarkdownRenderer.tsx
│   │   └── ui/                  # Reusable UI components
│   ├── config/                  # Configuration files
│   ├── hooks/                   # Custom React hooks
│   ├── lib/                     # Utility functions
│   └── types/                   # TypeScript type definitions
├── public/                      # Static assets
├── docs/                        # Documentation
└── tests/                       # Test files
```
We welcome contributions from the community! BeautifyOllama is an early-stage project with lots of opportunities to make an impact.
- 🐛 Bug Reports - Help us identify and fix issues
- 💡 Feature Requests - Suggest new functionality
- 💻 Code Contributions - Submit pull requests
- 📖 Documentation - Improve README, guides, and code comments
- 🎨 Design - UI/UX improvements and suggestions
- 🧪 Testing - Help test new features and edge cases
- Fork the repository on GitHub
- Clone your fork locally:

  ```bash
  git clone https://github.com/your-username/BeautifyOllama.git
  cd BeautifyOllama
  ```

- Create a branch for your feature:

  ```bash
  git checkout -b feature/your-feature-name
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Start the development server:

  ```bash
  npm run dev
  ```
- Code Style: Follow the existing code style and use TypeScript
- Commits: Use conventional commit messages (`feat:`, `fix:`, `docs:`, etc.)
- Testing: Add tests for new features when applicable
- Documentation: Update README and inline comments for new features
- Pull Requests: Provide clear descriptions and link related issues
```bash
npm run dev        # Start development server
npm run build      # Build for production
npm run start      # Start production server
npm run lint       # Run ESLint
npm run type-check # Run TypeScript compiler
npm test           # Run tests (when available)
```
- Basic chat interface
- Ollama integration
- Theme system
- Responsive design
- Enhanced error handling
- Performance optimizations
- API key management system
- Conversation history persistence
- File upload and processing
- Advanced model settings
- Export/import conversations
- Multi-user support
- Cloud synchronization
- Plugin architecture
- Usage analytics
- Advanced security features
- Mobile applications
- Desktop applications
- API for third-party integrations
- Marketplace for extensions
| Feature | Status | Priority |
|---|---|---|
| Core Chat | ✅ Complete | High |
| Theme System | ✅ Complete | High |
| Mobile Support | ✅ Complete | High |
| API Keys | 🚧 In Progress | High |
| File Upload | 📋 Planned | Medium |
| Cloud Sync | 📋 Planned | Low |
Ollama Connection Failed
```bash
# Check if Ollama is running
ollama serve

# Verify models are available
ollama list

# Test the API endpoint
curl http://localhost:11434/api/tags
```
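The `/api/tags` endpoint responds with JSON of the form `{"models": [{"name": "llama2:latest", ...}]}`, so parsing out the names makes it easy to distinguish "server unreachable" from "no models pulled yet". A small sketch (the helper name is illustrative):

```typescript
// Shape of the relevant part of Ollama's /api/tags response.
interface TagsResponse {
  models: Array<{ name: string }>;
}

// Extract installed model names from the raw /api/tags body.
export function modelNames(tagsJson: string): string[] {
  const parsed = JSON.parse(tagsJson) as TagsResponse;
  return parsed.models.map((m) => m.name);
}

// Usage against a running Ollama instance:
// const body = await fetch("http://localhost:11434/api/tags").then((r) => r.text());
// console.log(modelNames(body)); // empty array => pull a model first
```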
Build Errors
```bash
# Clear the Next.js cache
rm -rf .next

# Reinstall dependencies
rm -rf node_modules package-lock.json
npm install
```
Hydration Errors
- Clear browser cache and localStorage
- Restart development server
- Check for theme provider issues
- 📚 Documentation
- 💬 GitHub Discussions
- 🐛 Issue Tracker
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama Team - For the excellent local AI runtime
- Next.js Team - For the amazing React framework
- Vercel - For seamless deployment platform
- TailwindCSS - For the utility-first CSS framework
- Framer Motion - For beautiful animations
- All Contributors - For making this project better
Made with ❤️ by the BeautifyOllama team
⭐ Star us on GitHub • 🐛 Report Bug • 💬 Join Discussion