
Agents using different LLMs #341

Answered by donzorro
donzorro asked this question in Q&A

I think I got it to work. Here is an example of how to get Ollama + DeepSeek working:

from praisonaiagents import Agent

llm_config = {
    "model": "ollama/deepseek-r1:8b",          # Provider prefix + model name
    
    # Core settings
    "temperature": 0.7,                        # Controls randomness (0.0-1.0)
    "timeout": 30,                             # Request timeout in seconds
    "top_p": 0.9,                              # Nucleus sampling parameter
    "max_tokens": 1000,                        # Max tokens in the response
    
    # API settings (optional)
    "api_key": None,                           # Your API key (or use an environment variable)
    "base_url": "http://localhost:11434/v1",   # Custom API endpoint…
}
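For completeness, here is a minimal sketch of how the config might be wired into an agent. The llm= keyword and the start() call follow the patterns in the PraisonAI docs, but treat the exact signature as an assumption and verify it against your installed version; you also need Ollama running locally with the model pulled (ollama pull deepseek-r1:8b).

# Minimal usage sketch (assumed API: `llm=` accepts the config dict,
# and `start()` runs a single prompt; verify against your version).
agent = Agent(
    instructions="You are a helpful assistant.",
    llm=llm_config,
)

response = agent.start("Why is the sky blue?")
print(response)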

Answer selected by MervinPraison