Feature Description
Support inference using locally hosted models through Ollama via LiteLLM's ollama provider. This would allow ISEK users to run models like deepseek-coder, llama3, or mistral entirely on their own machines without relying on external APIs.
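For context, LiteLLM already routes to a local Ollama server through its `ollama/` model prefix. A minimal sketch of the underlying call ISEK would be wrapping, assuming an Ollama server is running locally (`ollama serve`) and the model has been pulled (`ollama pull llama3`):

```python
import litellm

# Call a locally served Ollama model through LiteLLM.
# Assumes Ollama is serving on its default port; no API key is involved.
response = litellm.completion(
    model="ollama/llama3",              # LiteLLM's ollama provider prefix
    api_base="http://localhost:11434",  # Ollama's default local endpoint
    messages=[{"role": "user", "content": "Summarize this document locally."}],
)
print(response.choices[0].message.content)
```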
Use Case
Many developers and researchers prefer running models locally for reasons such as:
- Data privacy and security (e.g., local biomedical or financial documents)
- Reduced inference latency
- Offline or low-connectivity environments
- Cost control (avoiding API token usage)
Integrating local Ollama models into ISEK would enable users to prototype agents, pipelines, and tools in a fully self-contained and reproducible environment.
Why This Is Beneficial for ISEK
- Aligns with ISEK’s goal of supporting flexible, extensible, and developer-friendly agent systems.
- Expands ISEK’s compatibility with open-source and non-cloud models.
- Enables offline and edge computing use cases.
- Enhances reproducibility for academic and enterprise users who prefer local execution.
Implementation Suggestions
- Extend `LiteLLMModel` to support `provider='ollama'`, allowing a custom `model_id` and `base_url`; no `api_key` is required.
- Update `PROVIDER_MAP` with a minimal configuration for Ollama, explicitly omitting the `api_env_key`:
"ollama": {
"model_env_key": "OLLAMA_MODEL",
"base_url_env_key": "OLLAMA_BASE_URL",
"default_model": "llama3" # or "deepseek-coder:6.7b", etc.
# Note: No api_env_key needed for Ollama
}
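To illustrate how the pieces could fit together, here is a rough sketch of the resolution logic. `resolve_ollama_completion` is a hypothetical helper invented for this example, not existing ISEK code; it takes the `PROVIDER_MAP` entry above as input:

```python
import os
import litellm

# Hypothetical helper, for illustration only; ISEK's actual LiteLLMModel
# internals may differ. provider_cfg is the "ollama" PROVIDER_MAP entry above.
def resolve_ollama_completion(messages, provider_cfg):
    model = os.environ.get(provider_cfg["model_env_key"], provider_cfg["default_model"])
    base_url = os.environ.get(provider_cfg["base_url_env_key"], "http://localhost:11434")
    # No api_key lookup here: the entry deliberately omits api_env_key.
    return litellm.completion(
        model=f"ollama/{model}",  # route through LiteLLM's ollama provider
        api_base=base_url,
        messages=messages,
    )
```

Because the entry carries no `api_env_key`, whatever code path normally raises on a missing API key could simply be skipped for Ollama.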