
use different LLMs #458

Answered by joshua-mo-143
ifeue asked this question in Q&A

Hi, the current state of LLM clients means you will likely need to write an enum wrapper:

// One variant per provider client you want to support.
enum Provider {
    OpenAI(rig::providers::openai::Client),
    // .. the rest of your providers here
}

We're taking a stab at dealing with this exact issue in #440, but for now this is probably the closest thing you're going to get to a solution.
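
To make the wrapper usable, you dispatch on the variant inside a method, so each arm works with its provider's concrete client type. Here is a minimal sketch, assuming the `openai::Client::agent` builder and the `Prompt` trait shown in rig's README; the `complete` method name and the boxed error type are illustrative choices, not part of rig's API:

```rust
use rig::{completion::Prompt, providers::openai};

// One variant per provider client you want to support.
enum Provider {
    OpenAI(openai::Client),
    // .. the rest of your providers here
}

impl Provider {
    // NOTE: `complete` is an illustrative name; the match keeps each
    // provider's concrete types contained in its own arm and returns
    // a plain String to the caller.
    async fn complete(&self, model: &str, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
        match self {
            Provider::OpenAI(client) => {
                let agent = client.agent(model).build();
                Ok(agent.prompt(prompt).await?)
            }
        }
    }
}
```

Call sites then stay provider-agnostic, e.g. `provider.complete("gpt-4", "Hello").await?`, and adding a provider is one new variant plus one new match arm.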

Answer selected by ifeue