A native macOS application built with SwiftUI that provides a seamless interface for interacting with your local Ollama instance. This companion app makes it easy to manage and interact with your local large language models.
- 🚀 Native macOS app built with SwiftUI
- 🔄 Real-time interaction with local Ollama instance
- 💻 Clean and intuitive user interface
- 🔒 Secure local-only operations
- ⚡️ High-performance response handling
- 🎨 Modern macOS design patterns
- macOS 14.0 or later
- Ollama installed and running locally
- Xcode 15.0+ for development
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/OllamaCompanion.git
  ```

- Open the project in Xcode:

  ```bash
  cd OllamaCompanion
  open OllamaCompanion.xcodeproj
  ```

- Build and run the project (⌘R)
The app connects to your local Ollama instance at `http://localhost:11434` by default. Make sure Ollama is running before launching the app.
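As a rough sketch, a minimal Swift client for Ollama's `/api/generate` endpoint could look like the following. The type and method names (`OllamaClient`, `generate`) are illustrative, not the app's actual API:

```swift
import Foundation

// Request/response shapes for Ollama's /api/generate endpoint.
struct GenerateRequest: Encodable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct GenerateResponse: Decodable {
    let response: String
}

// Illustrative client; names are hypothetical, not the app's real types.
struct OllamaClient {
    let baseURL = URL(string: "http://localhost:11434")!

    func generate(model: String, prompt: String) async throws -> String {
        var request = URLRequest(url: baseURL.appendingPathComponent("api/generate"))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(
            GenerateRequest(model: model, prompt: prompt, stream: false))
        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONDecoder().decode(GenerateResponse.self, from: data).response
    }
}
```

With `stream: false`, Ollama returns a single JSON object rather than a stream of chunks, which keeps the decoding logic simple.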
- Base URL: `http://localhost:11434`
- Default Model: `llama2`
- Context Window: 4096
- Temperature: 0.7
- Max Tokens: 2048
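These defaults could be gathered into a small settings type; the struct and property names below are a hypothetical sketch, not the app's actual configuration code:

```swift
import Foundation

// Hypothetical settings type mirroring the defaults listed above.
struct OllamaSettings {
    var baseURL = URL(string: "http://localhost:11434")!
    var defaultModel = "llama2"
    var contextWindow = 4096
    var temperature = 0.7
    var maxTokens = 2048
}
```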
The app follows modern Swift and SwiftUI best practices:
- MVVM Architecture: Clear separation of concerns with Views, ViewModels, and Models
- Swift Concurrency: async/await keeps network calls off the main thread for a responsive UI
- SwiftUI: Built with native SwiftUI components for the best macOS experience
- Combine Framework: Reactive programming for state management
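The MVVM and async/await pattern described above can be sketched as follows. `ChatViewModel` and its injected `generate` closure are illustrative names, not the app's real code:

```swift
import SwiftUI

// Sketch of an MVVM view model using Swift Concurrency and
// Combine-backed @Published state. Names are illustrative.
@MainActor
final class ChatViewModel: ObservableObject {
    @Published var messages: [String] = []
    @Published var isLoading = false

    // Injected so the view model stays testable; in the app this
    // would call the local Ollama instance.
    let generate: (String) async throws -> String

    init(generate: @escaping (String) async throws -> String) {
        self.generate = generate
    }

    func send(_ prompt: String) async {
        isLoading = true
        defer { isLoading = false }
        messages.append(prompt)
        if let reply = try? await generate(prompt) {
            messages.append(reply)
        }
    }
}
```

Injecting the request as a closure keeps the view model free of networking details, so SwiftUI views can bind to `messages` and `isLoading` while tests supply a stub.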
OllamaCompanion/
├── Views/ # SwiftUI views
├── ViewModels/ # Business logic and state management
├── Services/ # API and core services
├── Models/ # Data models
└── Assets/ # Resources and assets
- Swift and SwiftUI best practices
- Comprehensive documentation
- Unit and UI tests
- Modern error handling
- Performance optimizations
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the amazing local LLM runtime
- The Swift and SwiftUI community for their invaluable resources
For support, please open an issue in the GitHub repository or contact the maintainers.
Made with ❤️ for the Ollama community