
Commit 61bd541

bump version to 1.1.0
1 parent ff1e122

6 files changed: +6 -36 lines

README.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -135,7 +135,7 @@ chat.ask "Tell me a story about a Ruby programmer" do |chunk|
   print chunk.content
 end
 
-# Set personality or behavior with instructions (aka system prompts) - available from 1.1.0
+# Set personality or behavior with instructions (aka system prompts)
 chat.with_instructions "You are a friendly Ruby expert who loves to help beginners"
 
 # Understand content in multiple forms
@@ -171,7 +171,7 @@ end
 # In a background job
 chat = Chat.create! model_id: "gpt-4o-mini"
 
-# Set personality or behavior with instructions (aka system prompts) - they're persisted too! - available from 1.1.0
+# Set personality or behavior with instructions (aka system prompts) - they're persisted too!
 chat.with_instructions "You are a friendly Ruby expert who loves to help beginners"
 
 chat.ask("What's your favorite Ruby gem?") do |chunk|
````

docs/guides/chat.md

Lines changed: 1 addition & 11 deletions
````diff
@@ -43,11 +43,6 @@ claude_chat = RubyLLM.chat(model: 'claude-3-5-sonnet-20241022')
 chat.with_model('gemini-2.0-flash')
 ```
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> The following model aliases and provider selection features are available in the upcoming version.
-
 RubyLLM supports model aliases, so you don't need to remember specific version numbers:
 
 ```ruby
@@ -69,11 +64,6 @@ See [Working with Models]({% link guides/models.md %}) for more details on model
 
 ## Instructions (aka System Prompts)
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> chat.with_instructions is coming in 1.1.0. 1.0.x users should use `add_message role: system, content: PROMPT`
-
 System prompts allow you to set specific instructions or context that guide the AI's behavior throughout the conversation. These prompts are not directly visible to the user but help shape the AI's responses:
 
 ```ruby
@@ -86,7 +76,7 @@ chat.with_instructions "You are a helpful Ruby programming assistant. Always inc
 # Now the AI will follow these instructions in all responses
 response = chat.ask "How do I handle file operations in Ruby?"
 
-# You can add multiple system messages or update them during the conversation - available from 1.1.0
+# You can add multiple system messages or update them during the conversation
 chat.with_instructions "Always format your code using proper Ruby style conventions and include comments."
 
 # System prompts are especially useful for:
````
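
For 1.0.x readers, the removed warning pointed at a manual fallback. A minimal sketch of the guide's instructions API with that fallback shown as a comment; the model ID and prompt strings are illustrative:

```ruby
require 'ruby_llm'

chat = RubyLLM.chat(model: 'gpt-4o-mini')

# 1.1.0+: set or update instructions at any point in the conversation.
chat.with_instructions "You are a helpful Ruby programming assistant. Always include code examples."

# 1.0.x fallback mentioned in the removed warning (roughly equivalent):
# chat.add_message role: :system, content: "You are a helpful Ruby programming assistant."

response = chat.ask "How do I handle file operations in Ruby?"
puts response.content
```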

docs/guides/models.md

Lines changed: 0 additions & 10 deletions
````diff
@@ -29,11 +29,6 @@ chat.with_model('claude-3-5-sonnet')
 
 ### Model Resolution
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> Provider-Specific Match and Alias Resolution will be available in the next release.
-
 When you specify a model, RubyLLM follows these steps to find it:
 
 1. **Exact Match**: First tries to find an exact match for the model ID
@@ -66,11 +61,6 @@ chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')
 
 ### Model Aliases
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> Alias Resolution will be available in the next release.
-
 RubyLLM provides convenient aliases for popular models, so you don't have to remember specific version numbers:
 
 ```ruby
````
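
A minimal sketch of the alias resolution and provider selection the removed warnings referred to, built from the guide's own examples; it assumes credentials for the relevant providers are configured:

```ruby
require 'ruby_llm'

# Alias resolution: 'claude-3-5-sonnet' resolves to a concrete versioned model ID
# (e.g. 'claude-3-5-sonnet-20241022'), so you don't have to remember the date suffix.
chat = RubyLLM.chat(model: 'claude-3-5-sonnet')

# Provider-specific match: pin the same alias to a particular provider, here Bedrock.
bedrock_chat = RubyLLM.chat(model: 'claude-3-5-sonnet', provider: 'bedrock')

# Switch an existing chat to another model at any time.
chat.with_model('gemini-2.0-flash')
```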

docs/guides/rails.md

Lines changed: 2 additions & 7 deletions
````diff
@@ -130,25 +130,20 @@ end
 
 ## Instructions (aka System Prompts)
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> chat.with_instructions is coming in 1.1.0. 1.0.x users should use `chat.messages.create! role: system, content: PROMPT`
-
 Instructions help guide the AI's behavior throughout a conversation. With Rails integration, these messages are automatically persisted just like regular chat messages:
 
 ```ruby
 # Create a new chat
 chat = Chat.create!(model_id: 'gpt-4o-mini')
 
-# Add instructions (these are persisted) - available from 1.1.0
+# Add instructions (these are persisted)
 chat.with_instructions("You are a helpful Ruby programming assistant. Always include code examples in your responses and explain them line by line.")
 
 # Ask questions - the AI will follow the instructions
 response = chat.ask("How do I handle file operations in Ruby?")
 puts response.content # Will include detailed code examples
 
-# Add additional instructions - available from 1.1.0
+# Add additional instructions
 chat.with_instructions("Always format your code using proper Ruby style conventions and include comments.")
 # Both instructions are now persisted and active
 
````
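
A minimal sketch contrasting the 1.0.x fallback from the removed warning with the 1.1.0 call, assuming a Chat model wired up per the Rails guide; the prompt text is illustrative:

```ruby
chat = Chat.create!(model_id: 'gpt-4o-mini')

# 1.0.x fallback from the removed warning (persist a system message by hand):
# chat.messages.create! role: :system, content: 'You are a helpful Ruby programming assistant.'

# 1.1.0+: with_instructions creates and persists the system message for you.
chat.with_instructions('You are a helpful Ruby programming assistant.')

response = chat.ask('How do I handle file operations in Ruby?')
puts response.content
```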

docs/index.md

Lines changed: 0 additions & 5 deletions
````diff
@@ -21,11 +21,6 @@ A delightful Ruby way to work with AI through a unified interface to OpenAI, Ant
 
 ---
 
-{: .warning-title }
-> Coming in v1.1.0
->
-> Amazon Bedrock support is coming in v1.1.0
-
 <div style="display: flex; align-items: center; flex-wrap: wrap; gap: 1em; margin-bottom: 1em">
 <img src="https://upload.wikimedia.org/wikipedia/commons/4/4d/OpenAI_Logo.svg" alt="OpenAI" height="40" width="120">
 <img src="https://upload.wikimedia.org/wikipedia/commons/7/78/Anthropic_logo.svg" alt="Anthropic" height="40" width="120">
````

lib/ruby_llm/version.rb

Lines changed: 1 addition & 1 deletion
````diff
@@ -1,5 +1,5 @@
 # frozen_string_literal: true
 
 module RubyLLM
-  VERSION = '1.1.0rc2'
+  VERSION = '1.1.0'
 end
````
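
With this change the released gem reports 1.1.0 instead of the release candidate; a quick way to confirm which version is installed:

```ruby
require 'ruby_llm'

puts RubyLLM::VERSION # => "1.1.0"
```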

0 commit comments
