Commit 4aed628

Model provider support.

Authored and committed by crmne and OriPekelman
1 parent 5924425

File tree

5 files changed: +66 -15 lines

  docs/_config.yml
  docs/guides/chat.md
  docs/guides/models.md
  lib/ruby_llm.rb
  lib/ruby_llm/chat.rb

docs/_config.yml

Lines changed: 12 additions & 0 deletions

@@ -40,6 +40,18 @@ color_scheme: light
 ga_tracking:
 ga_tracking_anonymize_ip: true
 
+# Callouts
+callouts:
+  new:
+    title: New
+    color: green
+  warning:
+    title: Warning
+    color: yellow
+  note:
+    title: Note
+    color: blue
+
 # Custom plugins (GitHub Pages allows these)
 plugins:
   - jekyll-remote-theme
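
For context, these callout styles back the `{: .warning-title }` blocks added to the guides below. In a just-the-docs page, a configured callout is written roughly like this (a sketch, not part of the diff; the wording is invented):

```markdown
{: .note-title }
> Heads up
>
> This paragraph renders as a blue "Note" callout, using the colors defined above.
```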

docs/guides/chat.md

Lines changed: 27 additions & 0 deletions

@@ -43,6 +43,33 @@ claude_chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')
 chat.with_model('gemini-2.0-flash')
 ```
 
+{: .warning-title }
+> Coming in v1.1.0
+>
+> The following model aliases and provider selection features are available in the upcoming version.
+
+RubyLLM supports model aliases, so you don't need to remember specific version numbers:
+
+```ruby
+# Instead of this:
+chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')
+
+# You can simply write:
+chat = RubyLLM.chat(model: 'claude-3-5-sonnet')
+```
+
+You can also specify a specific provider to use with a model:
+
+```ruby
+# Use a specific provider (when the same model is available from multiple providers)
+chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')
+
+# Or set the provider after initialization
+chat = RubyLLM.chat(model: 'gpt-4o')
+  .with_provider('azure')
+```
+
+See [Working with Models]({% link guides/models.md %}) for more details on model selection.
 ## System Prompts
 
 System prompts allow you to set specific instructions or context that guide the AI's behavior throughout the conversation. These prompts are not directly visible to the user but help shape the AI's responses:

docs/guides/models.md

Lines changed: 13 additions & 12 deletions

@@ -65,6 +65,11 @@ deepseek_models = RubyLLM.models.by_provider('deepseek')
 
 ## Using Model Aliases
 
+{: .warning-title }
+> Coming in v1.1.0
+>
+> This feature is available in the upcoming version but not in the latest release.
+
 RubyLLM provides convenient aliases for popular models, so you don't have to remember specific version numbers:
 
 ```ruby
@@ -77,20 +82,16 @@ chat = RubyLLM.chat(model: 'gpt-4o')
 chat = RubyLLM.chat(model: 'gpt-4o-2024-11-20')
 ```
 
-Aliases for common models include:
+You can also specify a different provider to use with a model:
 
-| Alias | Resolves To |
-|--------|-------------|
-| `claude-3-5-sonnet` | `claude-3-5-sonnet-20241022` |
-| `claude-3-5-haiku` | `claude-3-5-haiku-20241022` |
-| `claude-3-7-sonnet` | `claude-3-7-sonnet-20250219` |
-| `claude-3-opus` | `claude-3-opus-20240229` |
-| `gpt-4o` | `gpt-4o-2024-11-20` |
-| `gpt-4o-mini` | `gpt-4o-mini-2024-07-18` |
-| `gemini-1.5-flash` | `gemini-1.5-flash-002` |
-| `gemini-2.0-flash` | `gemini-2.0-flash-001` |
+```ruby
+# Use a specific model via a different provider
+chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')
 
-Aliases are particularly useful when you want your code to always use the latest stable version of a model without having to update your codebase when providers release new model versions.
+# Or set the provider after initialization
+chat = RubyLLM.chat(model: 'gpt-4o')
+  .with_provider('azure')
+```
 
 ## Chaining Filters
 
lib/ruby_llm.rb

Lines changed: 2 additions & 2 deletions

@@ -29,8 +29,8 @@ module RubyLLM
   class Error < StandardError; end
 
   class << self
-    def chat(model: nil)
-      Chat.new(model: model)
+    def chat(model: nil, provider: nil)
+      Chat.new(model: model, provider: provider)
     end
 
     def embed(...)
lib/ruby_llm/chat.rb

Lines changed: 12 additions & 1 deletion

@@ -13,9 +13,10 @@ class Chat
 
     attr_reader :model, :messages, :tools
 
-    def initialize(model: nil)
+    def initialize(model: nil, provider: nil)
       model_id = model || RubyLLM.config.default_model
       self.model = model_id
+      self.provider = provider if provider
       @temperature = 0.7
       @messages = []
       @tools = {}
@@ -52,6 +53,16 @@ def model=(model_id)
       @provider = Models.provider_for model_id
     end
 
+    def provider=(provider_slug)
+      @provider = Provider.providers[provider_slug.to_sym] ||
+                  raise(Error, "Unknown provider: #{provider_slug}")
+    end
+
+    def with_provider(provider_slug)
+      self.provider = provider_slug
+      self
+    end
+
     def with_model(model_id)
       self.model = model_id
       self
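
For reference, a short usage sketch of the new methods, based only on the code above (not part of the commit; the 'azure' slug is assumed to be registered in Provider.providers, and chat.ask is assumed to be the usual entry point):

```ruby
chat = RubyLLM.chat(model: 'gpt-4o')

# A known slug is looked up as Provider.providers[:azure]; with_provider
# returns self, so further calls can chain.
chat.with_provider('azure').ask('Hello!')

# An unknown slug makes the lookup return nil, so the guard raises.
chat.with_provider('no-such-provider')
# => RubyLLM::Error: Unknown provider: no-such-provider
```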
