This repository was archived by the owner on Jun 7, 2025. It is now read-only.

r26D/runpod_ollama_cuda


Ollama CUDA Container for RunPod

This project provides a Docker container setup for running Ollama with CUDA support on RunPod. It leverages the container templates and scripts from the RunPod Containers repository to ensure compatibility with the RunPod platform.

Disclaimer

This is an unofficial, community-driven project and is not affiliated with RunPod or Ollama. While this container is designed to work on RunPod's infrastructure, it is independently maintained and not officially supported by RunPod. This project was created to support personal development work with AI agents and is shared with the community in hopes that others might find it useful.

Overview

The goal of this project is to create an easy-to-use, production-ready Docker container that:

  • Runs Ollama with full CUDA support
  • Is optimized for deployment on RunPod's GPU infrastructure
  • Follows RunPod's container best practices and requirements
  • Provides a seamless experience for running large language models
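As a point of reference, the Ollama project publishes an official CUDA-enabled Docker image, so a generic GPU deployment looks roughly like the following. This is a sketch of the general pattern, not this repository's documented workflow, and it assumes the NVIDIA Container Toolkit is installed on the host:

```shell
# Run the official Ollama image with GPU access (illustrative only;
# this repo's own image and entrypoint may differ).
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```

The `-v ollama:/root/.ollama` volume keeps downloaded model weights across container restarts, which matters on ephemeral GPU instances like RunPod pods.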

Features

  • CUDA support for GPU acceleration
  • Integration with RunPod's infrastructure
  • Optimized for AI/ML workloads
  • Easy deployment and scaling
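To illustrate what a deployed container exposes: Ollama serves an HTTP API on port 11434, so once a container named `ollama` is running, a model can be pulled and queried like this (the model name is only an example):

```shell
# Pull a model inside the running container, then query the
# generate endpoint from the host (hypothetical model choice).
docker exec -it ollama ollama pull llama3
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'
```

On RunPod, the same API would typically be reached through the pod's exposed-port proxy URL rather than `localhost`.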

Related Projects

Getting Started

[Documentation and setup instructions will be added as the project develops]

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This repository is distributed under multiple licenses: the MIT License (see LICENSE), plus separate license files for bundled components, LICENSE.ollama_proxy and LICENSE.runpod.
