Rails Generator for RubyLLM Models #75
Changes from 28 commits
@@ -7,31 +7,67 @@ permalink: /guides/rails
---
# Rails Integration

> **Review comment:** remove empty line

{: .no_toc }

-RubyLLM offers seamless integration with Ruby on Rails applications through helpers for ActiveRecord models. This allows you to easily persist chat conversations, including messages and tool interactions, directly in your database.
-{: .fs-6 .fw-300 }
+RubyLLM offers seamless integration with Ruby on Rails applications through helpers for ActiveRecord models. This allows you to easily persist chat conversations, including messages and tool interactions, directly in your database. {: .fs-6 .fw-300 }

> **Review comment:** stick to previous version

## Table of contents

> **Review comment:** remove empty line

{: .no_toc .text-delta }

-1. TOC
-{:toc}
+1. TOC {:toc}

> **Review comment:** stick to previous version
---

After reading this guide, you will know:

-* How to set up ActiveRecord models for persisting chats and messages.
-* How to use `acts_as_chat` and `acts_as_message`.
-* How chat interactions automatically persist data.
-* A basic approach for integrating streaming responses with Hotwire/Turbo Streams.
+- How to set up ActiveRecord models for persisting chats and messages.
+- How to use `acts_as_chat` and `acts_as_message`.
+- How chat interactions automatically persist data.
+- A basic approach for integrating streaming responses with Hotwire/Turbo Streams.

> kieranklaassen marked this conversation as resolved.

> **Review comment:** unnecessary change
## Setup

-### Create Migrations
+### Using the Generator

The simplest way to set up RubyLLM in your Rails application is to use the provided generator:

```bash
# Generate all necessary models, migrations, and configuration
rails generate ruby_llm:install
```

This will create:

- A `Chat` model for storing chat sessions
- A `Message` model for storing individual messages
- A `ToolCall` model for storing tool calls
- Migrations for all these models
- A RubyLLM initializer

If you need to customize model names to avoid namespace collisions, you can provide options:

```bash
rails generate ruby_llm:install \
  --chat-model-name=Conversation \
  --message-model-name=ChatMessage \
  --tool-call-model-name=FunctionCall
```

After running the generator, run the migrations:

```bash
rails db:migrate
```

### Manual Setup (Alternative)

If you prefer to set up the models manually, follow these steps:

-First, generate migrations for your `Chat` and `Message` models. You'll also need a `ToolCall` model if you plan to use [Tools]({% link guides/tools.md %}).
+#### Create Migrations
+
+Generate migrations for your `Chat` and `Message` models. You'll also need a `ToolCall` model if you plan to use [Tools]({% link guides/tools.md %}).
```bash
# Generate basic models and migrations
```

@@ -88,7 +124,7 @@ end
Run the migrations: `rails db:migrate`

-### Set Up Models with `acts_as` Helpers
+#### Set Up Models with `acts_as` Helpers

Include the RubyLLM helpers in your ActiveRecord models.

@@ -121,13 +157,21 @@ class ToolCall < ApplicationRecord
end
```

-{: .note }
-The `acts_as` helpers primarily handle loading history and saving messages/tool calls related to the chat interaction. Add your application-specific logic (associations, validations, scopes, callbacks) as usual.
+{: .note } The `acts_as` helpers primarily handle loading history and saving messages/tool calls related to the chat interaction. Add your application-specific logic (associations, validations, scopes, callbacks) as usual.

> **Review comment:** stick to previous version
### Configure RubyLLM

Ensure your RubyLLM configuration (API keys, etc.) is set up, typically in `config/initializers/ruby_llm.rb`. See the [Installation Guide]({% link installation.md %}) for details.

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key = ENV["OPENAI_API_KEY"]
  config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
  # Add other provider keys as needed
end
```

> **Review comment:** no need for this, we have the configuration guide for it

## Basic Usage

The `acts_as_chat` helper delegates common `RubyLLM::Chat` methods to your `Chat` model. When you call these methods on an ActiveRecord `Chat` instance, RubyLLM automatically handles persistence.
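The delegate-and-persist idea can be sketched in plain Ruby. This is a toy illustration of the pattern only: `FakeLLMChat` and `ChatRecord` are hypothetical stand-ins for `RubyLLM::Chat` and an ActiveRecord model, not part of the gem.

```ruby
# Toy sketch of the pattern acts_as_chat implements: delegate `ask` to an
# underlying chat object and persist both sides of the exchange.
class FakeLLMChat
  def ask(question)
    "echo: #{question}" # stand-in for a real model response
  end
end

class ChatRecord
  attr_reader :messages

  def initialize(llm_chat = FakeLLMChat.new)
    @llm_chat = llm_chat
    @messages = [] # stands in for rows in the messages table
  end

  # Delegated ask: save the user message, call the model, save the reply
  def ask(question)
    @messages << { role: "user", content: question }
    reply = @llm_chat.ask(question)
    @messages << { role: "assistant", content: reply }
    reply
  end
end
```

In the real helper, the same hook points write `Message` records (and tool-call records) instead of appending to an array.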
@@ -181,6 +225,18 @@ You can combine `acts_as_chat` with streaming and Turbo Streams for real-time UI

Here's a simplified approach using a background job:

```ruby
# app/jobs/chat_job.rb
class ChatJob < ApplicationJob
  def perform(chat_id, question)
    chat = Chat.find(chat_id)
    chat.ask(question) do |chunk|
      Turbo::StreamsChannel.broadcast_append_to(
        chat, target: "response", partial: "messages/chunk", locals: { chunk: chunk }
      )
    end
  end
end

# app/models/chat.rb
class Chat < ApplicationRecord
  acts_as_chat
end
```

@@ -235,33 +291,32 @@ end
```erb
<!-- Your form to submit new messages -->
<%= form_with(url: chat_messages_path(@chat), method: :post) do |f| %>
  <%= f.text_area :content %>
-  <%= f.submit "Send" %>
+  <%= f.button "Send", disable_with: "Sending..." %>
<% end %>

<%# app/views/messages/_message.html.erb %>
<%= turbo_frame_tag message do %>
  <div class="message <%= message.role %>">
    <strong><%= message.role.capitalize %>:</strong>
    <%# Target div for streaming content %>
-    <div id="<%= dom_id(message, "content") %>" style="display: inline;">
+    <div id="<%= dom_id(message, "content") %>" class="message-content">
      <%# Render initial content if not streaming, otherwise job appends here %>
      <%= simple_format(message.content) %>
    </div>
  </div>
<% end %>
```

-{: .note }
-This example shows the core idea. You'll need to adapt the broadcasting, targets, and partials for your specific UI needs (e.g., handling Markdown rendering, adding styling, showing typing indicators). See the [Streaming Responses Guide]({% link guides/streaming.md %}) for more on streaming itself.
+{: .note } This example shows the core idea. You'll need to adapt the broadcasting, targets, and partials for your specific UI needs (e.g., handling Markdown rendering, adding styling, showing typing indicators). See the [Streaming Responses Guide]({% link guides/streaming.md %}) for more on streaming itself.

> **Review comment:** stick to previous version
## Customizing Models

Your `Chat`, `Message`, and `ToolCall` models are standard ActiveRecord models. You can add any other associations, validations, scopes, callbacks, or methods as needed for your application logic. The `acts_as` helpers provide the core persistence bridge to RubyLLM without interfering with other model behavior.

## Next Steps

-* [Chatting with AI Models]({% link guides/chat.md %})
-* [Using Tools]({% link guides/tools.md %})
-* [Streaming Responses]({% link guides/streaming.md %})
-* [Working with Models]({% link guides/models.md %})
-* [Error Handling]({% link guides/error-handling.md %})
+- [Chatting with AI Models]({% link guides/chat.md %})
+- [Using Tools]({% link guides/tools.md %})
+- [Streaming Responses]({% link guides/streaming.md %})
+- [Working with Models]({% link guides/models.md %})
+- [Error Handling]({% link guides/error-handling.md %})

> **Review comment:** Wouldn't it be annoying to have a new README? I would delete this
>
> **Reply:** I think the name "README" is a bit misleading here. This is the text generated AFTER you run the generator to explain what you did, whereas the readme for RubyLLM remains the same and untouched. Originally, this did overwrite the README, but has been corrected to be more of an ephemeral display post install.
>
> **Reply:** Yes, you can see it in the test added too. It's to give direction after using the generator.
---

@@ -0,0 +1,75 @@
# RubyLLM Rails Setup Complete!

Thanks for installing RubyLLM in your Rails application. Here's what was created:

## Models

- `<%= options[:chat_model_name] %>` - Stores chat sessions and their associated model ID
- `<%= options[:message_model_name] %>` - Stores individual messages in a chat
- `<%= options[:tool_call_model_name] %>` - Stores tool calls made by language models

## Configuration Options

The generator supports the following options to customize model names:

```bash
rails generate ruby_llm:install \
  --chat-model-name=Conversation \
  --message-model-name=ChatMessage \
  --tool-call-model-name=FunctionCall
```

This is useful when you need to avoid namespace collisions with existing models in your application. Table names will be automatically derived from the model names following Rails conventions.
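"Derived following Rails conventions" here means ActiveSupport's `tableize` (CamelCase to snake_case, then pluralized). As a rough illustration only, a naive pure-Ruby approximation — `naive_tableize` is a hypothetical helper, and the real inflector additionally handles acronyms and irregular plurals:

```ruby
# Naive approximation of ActiveSupport's String#tableize:
# split CamelCase words with underscores, downcase, then pluralize.
# Real Rails uses ActiveSupport::Inflector, which also knows irregular
# plurals ("Person" -> "people"); this sketch just appends "s".
def naive_tableize(model_name)
  snake = model_name.gsub(/([a-z\d])([A-Z])/, '\1_\2').downcase
  "#{snake}s"
end

naive_tableize("ChatMessage")  # => "chat_messages"
naive_tableize("FunctionCall") # => "function_calls"
```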
## Next Steps

1. **Run migrations:**
   ```bash
   rails db:migrate
   ```

2. **Set your API keys** in `config/initializers/ruby_llm.rb` or using environment variables:
   ```ruby
   # config/initializers/ruby_llm.rb
   RubyLLM.configure do |config|
     config.openai_api_key = ENV["OPENAI_API_KEY"]
     config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
     # etc.
   end
   ```
||
3. **Start using RubyLLM in your code:** | ||
```ruby | ||
# In a background job | ||
class ChatJob < ApplicationJob | ||
def perform(chat_id, question) | ||
chat = <%= options[:chat_model_name] %>.find(chat_id) | ||
chat.ask(question) do |chunk| | ||
Turbo::StreamsChannel.broadcast_append_to( | ||
chat, target: "response", partial: "messages/chunk", locals: { chunk: chunk } | ||
) | ||
end | ||
end | ||
end | ||
|
||
# Queue the job | ||
chat = <%= options[:chat_model_name] %>.create!(model_id: "gpt-4o-mini") | ||
ChatJob.perform_later(chat.id, "What's your favorite Ruby gem?") | ||
``` | ||
|
||
4. **For streaming responses** with ActionCable or Turbo: | ||
```ruby | ||
chat.ask("Tell me about Ruby on Rails") do |chunk| | ||
Turbo::StreamsChannel.broadcast_append_to( | ||
chat, target: "response", partial: "messages/chunk", locals: { chunk: chunk } | ||
) | ||
end | ||
``` | ||
|
||
## Advanced Usage | ||
|
||
- Add more fields to your models as needed | ||
- Customize the views to match your application design | ||
- Create a controller for chat interactions | ||
|
||
For more information, visit the [RubyLLM Documentation](https://github.com/crmne/ruby_llm) |
---

@@ -0,0 +1,3 @@
class <%= options[:chat_model_name] %> < ApplicationRecord
  <%= acts_as_chat_declaration %>
end
---

@@ -0,0 +1,8 @@
class Create<%= options[:chat_model_name].pluralize %> < ActiveRecord::Migration<%= migration_version %>
  def change
    create_table :<%= options[:chat_model_name].tableize %> do |t|
      t.string :model_id
      t.timestamps
    end
  end
end
---

@@ -0,0 +1,16 @@
# This migration must be run AFTER create_<%= options[:chat_model_name].tableize %> and create_<%= options[:tool_call_model_name].tableize %> migrations
# to ensure proper foreign key references

> **Review comment:** Would it be possible to enforce this constraint with the timestamp of the migration? Disclaimer: I'm new to making templates for migrations.
>
> **Reply:** See comments on latest PR below
class Create<%= options[:message_model_name].pluralize %> < ActiveRecord::Migration<%= migration_version %>
  def change
    create_table :<%= options[:message_model_name].tableize %> do |t|
      t.references :<%= options[:chat_model_name].tableize.singularize %>, null: false, foreign_key: true
      t.string :role
      t.text :content
      t.string :model_id
      t.integer :input_tokens
      t.integer :output_tokens
      t.references :<%= options[:tool_call_model_name].tableize.singularize %>
      t.timestamps
    end
  end
end
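On the review question above about enforcing order via timestamps: Rails numbers generated migrations through a `next_migration_number` hook that never goes backwards, so files created in one generator run sort in creation order even within the same second. A simplified standalone sketch of that logic (an approximation of what `ActiveRecord::Generators::Migration` does, not the exact implementation):

```ruby
require "time"

# Sketch: take the current UTC timestamp as the migration number, but
# never emit a number <= the previous one (zero-padded to 14 digits so
# string comparison matches numeric order for same-length values).
def next_migration_number(last_number)
  current = Time.now.utc.strftime("%Y%m%d%H%M%S")
  [current, format("%.14d", last_number.to_i + 1)].max
end

first  = next_migration_number("0")
second = next_migration_number(first) # strictly greater, even in the same second
```

Because the generator creates the chats migration before the messages migration, the messages file always receives a later number, which is what makes the "must run AFTER" comment hold in practice.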
---

@@ -0,0 +1,14 @@
<%#- Migration for creating tool_calls table with database-specific JSON handling -%>
class Create<%= options[:tool_call_model_name].pluralize %> < ActiveRecord::Migration<%= migration_version %>
  def change
    create_table :<%= options[:tool_call_model_name].tableize %> do |t|
      t.references :<%= options[:message_model_name].tableize.singularize %>, null: false, foreign_key: true
      t.string :tool_call_id, null: false
      t.string :name, null: false
      t.<%= postgresql? ? 'jsonb' : 'json' %> :arguments, default: {}
      t.timestamps
    end

    add_index :<%= options[:tool_call_model_name].tableize %>, :tool_call_id
  end
end
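The `<%= postgresql? ? 'jsonb' : 'json' %>` branch is resolved at generation time, when the template is rendered. A minimal stdlib-ERB illustration of that rendering step — a plain `postgresql` local stands in here for the generator's `postgresql?` helper:

```ruby
require "erb"

# Render the column line the way the generator would, with the database
# check already decided. `postgresql` is a stand-in local variable.
template = ERB.new("t.<%= postgresql ? 'jsonb' : 'json' %> :arguments, default: {}")

postgresql = true
template.result(binding)  # => "t.jsonb :arguments, default: {}"
```

With `postgresql = false` the same template renders `t.json` instead, so the generated migration never carries the conditional itself.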
---

@@ -0,0 +1,14 @@
# RubyLLM configuration
RubyLLM.configure do |config|
  # Set your API keys here or use environment variables
  # config.openai_api_key = ENV["OPENAI_API_KEY"]
  # config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
  # config.gemini_api_key = ENV["GEMINI_API_KEY"]
  # config.deepseek_api_key = ENV["DEEPSEEK_API_KEY"]

  # Uncomment to set a default model
  # config.default_model = "gpt-4o-mini"

  # Uncomment to set default options
  # config.default_options = { temperature: 0.7 }
end
---

@@ -0,0 +1,3 @@
class <%= options[:message_model_name] %> < ApplicationRecord
  <%= acts_as_message_declaration %>
end
---

@@ -0,0 +1,3 @@
class <%= options[:tool_call_model_name] %> < ApplicationRecord
  <%= acts_as_tool_call_declaration %>
end