Multi-Model Chat

A powerful Next.js application for simultaneously chatting with multiple AI models and comparing their responses in real-time. Built with modern web technologies and featuring a beautiful, responsive UI.

Created by Kushagra

Multi-Model Chat Demo

✨ Features

🤖 Multi-Model Support

  • OpenAI GPT-5 - OpenAI's most advanced reasoning model
  • Claude 4 Sonnet - Anthropic's most capable model with enhanced reasoning
  • DeepSeek V3 - DeepSeek's latest reasoning model
  • Gemini 2.0 Flash - Google's next-generation multimodal AI
  • Perplexity Sonar - Online search-powered AI model

🎯 Core Functionality

  • Multi-Model Chat: Compare responses from multiple AI models simultaneously
  • Single Model Mode: Focus on conversation with one specific model
  • Chat Sessions: Save and manage conversation history
  • Fork Conversations: Branch off specific model responses into single-model chats
  • Real-time Streaming: See responses as they're generated
  • Markdown Support: Proper formatting for code, lists, and rich text

🎨 User Experience

  • Dark/Light Mode: Toggle between themes
  • Responsive Design: Works seamlessly on desktop, tablet, and mobile
  • Horizontal Scrolling: Navigate between model responses easily
  • Analytics Dashboard: Track API usage and model performance
  • Model Logos: Visual identification for each AI provider

💾 Data Management

  • SQLite Database: Local storage for chat sessions and messages
  • Upstash Redis: Analytics and caching layer (see the sketch after this list)
  • Session Management: Organize and search through chat history
  • Export Options: Save conversations for future reference
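
For orientation, here is a minimal sketch of how a usage event could be recorded with the Upstash Redis client. It assumes the @upstash/redis package and the KV_REST_API_URL / KV_REST_API_TOKEN variables described under Environment Variables; the key layout and the trackModelUsage helper are illustrative, not the project's actual analytics schema.

    import { Redis } from "@upstash/redis";

    // Client configured from the optional Upstash variables in .env.
    const redis = new Redis({
      url: process.env.KV_REST_API_URL!,
      token: process.env.KV_REST_API_TOKEN!,
    });

    // Hypothetical helper: bumps a per-model request counter.
    export async function trackModelUsage(modelId: string): Promise<void> {
      await redis.incr(`analytics:model:${modelId}:requests`);
    }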

🚀 Quick Start

Prerequisites

  • Node.js 18+ and npm
  • API Keys from AI providers (see Environment Variables section)
  • Upstash Redis account (optional, for analytics)

Installation

  1. Clone the repository

    git clone https://github.com/kushagra-18/multi-model-chat.git
    cd multi-model-chat
  2. Install dependencies

    npm install
  3. Set up environment variables

    cp .env.example .env

    Edit .env and add your API keys (see Environment Variables section below).

  4. Initialize the database: The SQLite database is created automatically when you start the application.

  5. Start the development server

    npm run dev
  6. Open your browser: Navigate to http://localhost:3000

🔧 Environment Variables

Create a .env file in the root directory with the following variables:

Required API Keys

Variable                      Provider    Get From            Required
OPENAI_API_KEY                OpenAI      Platform Dashboard  ❌
ANTHROPIC_API_KEY             Anthropic   Console             ❌
GOOGLE_GENERATIVE_AI_API_KEY  Google      AI Studio           ✅
DEEPSEEK_API_KEY              DeepSeek    Platform            ❌
PERPLEXITY_API_KEY            Perplexity  Settings            ❌

Optional Services

Variable           Service        Get From  Purpose
KV_REST_API_URL    Upstash Redis  Console   Analytics
KV_REST_API_TOKEN  Upstash Redis  Console   Analytics

Note: At minimum you need GOOGLE_GENERATIVE_AI_API_KEY, since all requests are currently routed to Gemini 2.0 Flash for demo purposes. The other API keys enable the full multi-model experience.
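
For example, a minimal .env could look like the following (all values are placeholders; with only the Google key set, the demo routing described above still works):

    # AI provider keys (replace placeholders with your own keys)
    GOOGLE_GENERATIVE_AI_API_KEY=your-google-ai-studio-key
    OPENAI_API_KEY=your-openai-key
    ANTHROPIC_API_KEY=your-anthropic-key
    DEEPSEEK_API_KEY=your-deepseek-key
    PERPLEXITY_API_KEY=your-perplexity-key

    # Optional: Upstash Redis for analytics
    KV_REST_API_URL=https://your-instance.upstash.io
    KV_REST_API_TOKEN=your-upstash-token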

๐Ÿ—๏ธ Project Structure

multi-model-chat/
├── app/                    # Next.js app router
│   ├── api/                # API routes
│   │   ├── chat/           # Chat endpoints
│   │   ├── chats/          # Session management
│   │   └── analytics/      # Usage tracking
│   ├── globals.css         # Global styles
│   └── page.tsx            # Main application
├── components/             # React components
│   ├── ui/                 # Reusable UI components
│   └── chat/               # Chat-specific components
├── lib/                    # Utilities and configurations
│   ├── models.ts           # AI model definitions
│   ├── ai-providers.ts     # Provider clients
│   └── database.ts         # SQLite operations
├── types/                  # TypeScript type definitions
└── public/                 # Static assets
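
To illustrate how the pieces above fit together, an entry in lib/models.ts might look roughly like this; the interface and field names are hypothetical and may differ from the project's actual types:

    // Hypothetical shape of a model definition; the real file may differ.
    export interface ModelDefinition {
      id: string;          // internal identifier, e.g. "gemini-2.0-flash"
      name: string;        // display name shown in the UI
      provider: "openai" | "anthropic" | "google" | "deepseek" | "perplexity";
      enabled: boolean;    // whether the model participates in multi-model chat
    }

    export const MODELS: ModelDefinition[] = [
      { id: "gemini-2.0-flash", name: "Gemini 2.0 Flash", provider: "google", enabled: true },
      { id: "gpt-5", name: "OpenAI GPT-5", provider: "openai", enabled: true },
    ];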

🎮 Usage Guide

Multi-Model Chat Mode

  1. Enter your message in the input field at the bottom
  2. Press Enter or click Send to send to all enabled models
  3. Compare responses side by side in the grid layout
  4. Scroll horizontally to view all model responses
  5. Use action buttons to copy, like, or fork responses

Single Model Mode

  1. Click the "Single Model" button in the header
  2. Select your preferred model from the dropdown
  3. Have a focused conversation with one AI model
  4. Switch back to multi-model anytime

Managing Conversations

  • Save Sessions: Conversations are automatically saved
  • Access History: Click the history icon to view past chats
  • Fork Conversations: Click the fork button on any model response to create a new single-model session
  • Search Chats: Use the sidebar to find specific conversations

🔌 API Endpoints

Chat Endpoints

  • POST /api/chat/multi - Multi-model chat (usage sketch below)
  • POST /api/chat - Single model chat (streaming)

Session Management

  • GET /api/chats - List all chat sessions
  • POST /api/chats - Create new session
  • GET /api/chats/[id] - Get specific session
  • POST /api/chats/[id]/messages - Add message to session

Analytics

  • POST /api/analytics/track - Track usage events
  • GET /api/analytics/model-stats - Get model statistics
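
As a usage sketch, a client could call the multi-model endpoint with fetch. The request body shape (message, models) is an assumption for illustration, not the documented contract:

    // Hypothetical request to the multi-model endpoint; field names are assumed.
    const res = await fetch("/api/chat/multi", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        message: "Compare SSE and WebSockets for streaming chat responses.",
        models: ["gemini-2.0-flash", "gpt-5"],
      }),
    });
    const data = await res.json();
    console.log(data);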

🧪 Development

Running Tests

npm test

Building for Production

npm run build
npm start

Linting

npm run lint

๐Ÿ› Known Issues & Limitations

Current Bugs

  • Fork Button Limitation: Forking currently only shows a toast notification; it does not switch to single-model mode automatically
  • Model Selection: All API calls are currently routed to Gemini 2.0 Flash for demo purposes, regardless of the selected model
  • Timestamp Type Mismatch: Some TypeScript warnings related to Date vs. number timestamp types (see the sketch after this list)
  • Session Loading: Forked sessions may not load conversation history in single-model mode
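
One way the Date vs. number mismatch could be resolved, offered as a sketch rather than the project's actual fix, is to store timestamps as epoch milliseconds and convert to Date only at the display boundary:

    // Sketch: persist timestamps as numbers (epoch ms) and convert for display.
    type StoredMessage = { content: string; createdAt: number };

    function toDisplayTime(message: StoredMessage): string {
      return new Date(message.createdAt).toLocaleString();
    }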

Design Limitations

  • Horizontal Scroll: Grid layout may not work perfectly on all screen sizes
  • Message Overflow: Long messages might overflow card boundaries in some cases
  • Real-time Updates: Analytics refresh every 30 seconds, not real-time
  • Mobile Experience: Some UI elements may be cramped on very small screens

Performance Considerations

  • SQLite: May not scale for high-traffic deployments (consider PostgreSQL)
  • API Rate Limits: No built-in rate limiting for AI provider APIs
  • Memory Usage: Long conversations may impact browser performance
  • Concurrent Requests: Multiple model requests happen in parallel (may hit rate limits)

Missing Features

  • User Authentication: No login system implemented
  • Conversation Export: No built-in export to PDF/Word functionality
  • Custom Models: Can't add custom AI models without code changes
  • File Uploads: No support for document or image uploads
  • Voice Input: No speech-to-text functionality

๐Ÿค Contributing

We welcome contributions! Here's how you can help:

Reporting Issues

  • Use the GitHub Issues page
  • Include steps to reproduce the bug
  • Add screenshots or screen recordings if helpful
  • Specify your browser and operating system

Pull Requests

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Make your changes with proper commit messages
  4. Add tests if applicable
  5. Submit a pull request with a clear description

Development Setup

# Clone your fork
git clone https://github.com/your-username/multi-model-chat.git
cd multi-model-chat

# Install dependencies
npm install

# Create environment file
cp .env.example .env
# Add your API keys

# Start development server
npm run dev

📄 License

This project is open source and available under the MIT License.

๐Ÿ™ Acknowledgments

🆘 Support

๐Ÿ—บ๏ธ Roadmap

Short Term (Next Release)

  • Fix fork-to-single-model functionality
  • Improve mobile responsiveness
  • Add conversation export features
  • Implement proper error boundaries

Medium Term

  • Add user authentication system
  • Support for file uploads (images, documents)
  • Custom model integration
  • Advanced analytics dashboard

Long Term

  • Voice input/output support
  • Collaborative chat sessions
  • Plugin system for extensions
  • Self-hosted deployment guides

๐Ÿ‘จโ€๐Ÿ’ป About the Creator

Kushagra is a passionate developer creating innovative AI applications and tools.


Made with ❤️ for the AI community

Star ⭐ this repo if you found it helpful!
