Add Azure OpenAI provider #279

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Draft · wants to merge 6 commits into main
2 changes: 2 additions & 0 deletions lib/ruby_llm.rb
@@ -14,6 +14,7 @@
'ruby_llm' => 'RubyLLM',
'llm' => 'LLM',
'openai' => 'OpenAI',
'azure_openai' => 'AzureOpenAI',
'api' => 'API',
'deepseek' => 'DeepSeek',
'bedrock' => 'Bedrock',
@@ -85,6 +86,7 @@ def logger
RubyLLM::Provider.register :openrouter, RubyLLM::Providers::OpenRouter
RubyLLM::Provider.register :ollama, RubyLLM::Providers::Ollama
RubyLLM::Provider.register :gpustack, RubyLLM::Providers::GPUStack
RubyLLM::Provider.register :azure_openai, RubyLLM::Providers::AzureOpenAI

if defined?(Rails::Railtie)
require 'ruby_llm/railtie'
4 changes: 4 additions & 0 deletions lib/ruby_llm/configuration.rb
@@ -26,6 +26,10 @@ class Configuration
:ollama_api_base,
:gpustack_api_base,
:gpustack_api_key,
# Azure OpenAI Provider configuration
:azure_openai_api_base,
:azure_openai_api_version,
:azure_openai_api_key,
# Default models
:default_model,
:default_embedding_model,
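With these accessors added, the Azure provider can be configured like any other provider. A minimal sketch, assuming the gem's usual `RubyLLM.configure` block; the endpoint and version values are illustrative placeholders:

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  # Azure OpenAI resource endpoint, key, and API version (placeholder values)
  config.azure_openai_api_base    = 'https://my-resource.openai.azure.com'
  config.azure_openai_api_key     = ENV.fetch('AZURE_OPENAI_API_KEY')
  config.azure_openai_api_version = '2024-10-21'
end
```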
43 changes: 43 additions & 0 deletions lib/ruby_llm/providers/azure_openai.rb
@@ -0,0 +1,43 @@
# frozen_string_literal: true

module RubyLLM
  module Providers
    # Azure OpenAI API integration. Derived from OpenAI integration to support
    # OpenAI capabilities via Microsoft Azure endpoints.
    module AzureOpenAI
      extend OpenAI
      extend AzureOpenAI::Chat
      extend AzureOpenAI::Streaming
      extend AzureOpenAI::Models

      module_function

      def api_base(config)
        # https://<ENDPOINT>/openai/deployments/<MODEL>/chat/completions?api-version=<APIVERSION>
        "#{config.azure_openai_api_base}/openai"
      end

      def headers(config)
        {
          'Authorization' => "Bearer #{config.azure_openai_api_key}"
        }.compact
      end

      def capabilities
        OpenAI::Capabilities
      end

      def slug
        'azure_openai'
      end

      def configuration_requirements
        %i[azure_openai_api_key azure_openai_api_base azure_openai_api_version]
      end

      def local?
        false
      end
    end
  end
end
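With the configuration sketched above, `api_base` and `headers` resolve to the Azure-style base URL and bearer auth; the deployment path and `api-version` query string are appended by the Chat module below. A rough illustration, assuming `RubyLLM.config` exposes the global configuration object:

```ruby
config = RubyLLM.config

RubyLLM::Providers::AzureOpenAI.api_base(config)
# => "https://my-resource.openai.azure.com/openai"

RubyLLM::Providers::AzureOpenAI.headers(config)
# => { "Authorization" => "Bearer <AZURE_OPENAI_API_KEY>" }
```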
31 changes: 31 additions & 0 deletions lib/ruby_llm/providers/azure_openai/chat.rb
@@ -0,0 +1,31 @@
# frozen_string_literal: true

module RubyLLM
  module Providers
    module AzureOpenAI
      # Chat methods of the Azure OpenAI API integration
      module Chat
        extend OpenAI::Chat

        module_function

        def sync_response(connection, payload)
          # Hold config in instance variable for use in completion_url and stream_url
          @config = connection.config
          super
        end

        def completion_url
          # https://<ENDPOINT>/openai/deployments/<MODEL>/chat/completions?api-version=<APIVERSION>
          "deployments/#{@model_id}/chat/completions?api-version=#{@config.azure_openai_api_version}"
        end

        def render_payload(messages, tools:, temperature:, model:, stream: false)
          # Hold model_id in instance variable for use in completion_url and stream_url
          @model_id = model
          super
        end
      end
    end
  end
end
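Because `completion_url` is relative to `api_base`, the two compose into the documented Azure chat-completions endpoint. A usage sketch with placeholder values, assuming `RubyLLM.chat` accepts a `provider:` option and that the model name matches the Azure deployment name:

```ruby
chat = RubyLLM.chat(model: 'gpt-4o', provider: :azure_openai)

# With the configuration above this resolves to:
# POST https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-10-21
chat.ask('Say hello from Azure OpenAI')
```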
33 changes: 33 additions & 0 deletions lib/ruby_llm/providers/azure_openai/models.rb
@@ -0,0 +1,33 @@
# frozen_string_literal: true

module RubyLLM
  module Providers
    module AzureOpenAI
      # Models methods of the Azure OpenAI API integration
      module Models
        extend OpenAI::Models

        KNOWN_MODELS = [
          'gpt-4o'
        ].freeze

        module_function

        def models_url
          'models?api-version=2024-10-21'
        end

        def parse_list_models_response(response, slug, capabilities)
          # Select only the known models, since the full list from Azure OpenAI is
          # very long
          response.body['data'].select! do |m|
            KNOWN_MODELS.include?(m['id'])
          end
          # Use the OpenAI processor for the list, keeping in mind that pricing etc.
          # won't be correct
          super
        end
      end
    end
  end
end
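Since the Azure models endpoint returns every model available to the resource, the parser keeps only the entries in `KNOWN_MODELS`. A sketch of the effect, assuming `RubyLLM.models.refresh!` repopulates the registry for configured providers as it does elsewhere:

```ruby
RubyLLM.models.refresh!

azure_models = RubyLLM.models.all.select { |m| m.provider == 'azure_openai' }
azure_models.map(&:id)
# => ["gpt-4o"]  (only models listed in KNOWN_MODELS survive the filter)
```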
20 changes: 20 additions & 0 deletions lib/ruby_llm/providers/azure_openai/streaming.rb
@@ -0,0 +1,20 @@
# frozen_string_literal: true

module RubyLLM
  module Providers
    module AzureOpenAI
      # Streaming methods of the Azure OpenAI API integration
      module Streaming
        extend OpenAI::Streaming

        module_function

        def stream_response(connection, payload, &)
          # Hold config in instance variable for use in completion_url and stream_url
          @config = connection.config
          super
        end
      end
    end
  end
end
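Streaming only needs the stored config so it can reuse the same deployment URL; the public interface is unchanged. A sketch using the gem's block-based streaming, with the same hypothetical deployment as above:

```ruby
chat = RubyLLM.chat(model: 'gpt-4o', provider: :azure_openai)

# Chunks are yielded as they arrive from the Azure endpoint
chat.ask('Write a haiku about the cloud') do |chunk|
  print chunk.content
end
```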
7 changes: 7 additions & 0 deletions lib/tasks/models_update.rake
@@ -24,10 +24,17 @@ def configure_from_env
config.deepseek_api_key = ENV.fetch('DEEPSEEK_API_KEY', nil)
config.openrouter_api_key = ENV.fetch('OPENROUTER_API_KEY', nil)
configure_bedrock(config)
configure_azure_openai(config)
config.request_timeout = 30
end
end

def configure_azure_openai(config)
config.azure_openai_api_base = ENV.fetch('AZURE_OPENAI_ENDPOINT', nil)
config.azure_openai_api_key = ENV.fetch('AZURE_OPENAI_API_KEY', nil)
config.azure_openai_api_version = ENV.fetch('AZURE_OPENAI_API_VER', nil)
end

def configure_bedrock(config)
config.bedrock_api_key = ENV.fetch('AWS_ACCESS_KEY_ID', nil)
config.bedrock_secret_key = ENV.fetch('AWS_SECRET_ACCESS_KEY', nil)
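For the registry update task to pick up Azure OpenAI, the three environment variables above have to be set before it runs. A hypothetical listing with placeholder values (the task name is assumed from the file name, not shown in this diff):

```ruby
# Environment read by configure_azure_openai (placeholder values):
#   AZURE_OPENAI_ENDPOINT = "https://my-resource.openai.azure.com"
#   AZURE_OPENAI_API_KEY  = "<key>"
#   AZURE_OPENAI_API_VER  = "2024-10-21"
#
# then e.g.: rake models:update
```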