
Add Azure OpenAI provider #279


Draft · wants to merge 6 commits into main from add_azure_openai_provider

Conversation

oxaroky02

@oxaroky02 oxaroky02 commented Jul 9, 2025

What this does

Adds Azure OpenAI provider, derived from the existing OpenAI provider.

Provider can be configured as follows:

    context = RubyLLM.context do |config|
      config.azure_openai_api_base = ENV.fetch('AZURE_OPENAI_URI')
      config.azure_openai_api_key = ENV.fetch('AZURE_OPENAI_API_KEY', nil)
      config.azure_openai_api_version = ENV.fetch('AZURE_OPENAI_API_VER', nil)
      config.default_model = ENV.fetch('AZURE_OPENAI_MODEL', 'gpt-4o')
    end

    chat = context.chat(
      provider: :azure_openai,
      model: ENV.fetch('AZURE_OPENAI_MODEL', 'gpt-4o'),
      assume_model_exists: true
    )

    chat.ask "Hello!"

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

@oxaroky02
Author

Hola @crmne. Before I go further, can you please take a look and let me know if I'm on the right track?

I had to override some methods to "steal" the model ID and config properties because of the way Azure OpenAI API URLs are constructed from the model (deployment) name and API version.
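For context, the Azure OpenAI REST endpoint embeds the deployment (model) name and the api-version query parameter in the request URL, which is why the provider needs both up front. A minimal sketch of the URL shape (the helper name here is illustrative, not the provider's actual method):

```ruby
# Sketch of the Azure OpenAI chat-completions URL shape. Unlike the
# standard OpenAI API, the deployment (model) name and api-version
# are part of the URL itself, so the provider must know both before
# it can build the request.
def azure_completion_url(api_base, deployment, api_version)
  "#{api_base}/openai/deployments/#{deployment}/chat/completions" \
    "?api-version=#{api_version}"
end

azure_completion_url(
  'https://example.openai.azure.com',
  'gpt-4o',
  '2024-02-01'
)
# => "https://example.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-01"
```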

@crmne crmne added the new provider New provider integration label Jul 16, 2025
@oxaroky02
Author

@crmne, apologies if I am pestering you. I've been using the new provider with my project, where I have tested #ask extensively, including tool use, which is key to my app. My app uses both ollama and azure_openai, and I have switched between them to ensure behaviour is consistent.

I don't fully understand how / what I should do for the RSpec tests. Let me know how you would like me to proceed, or if you are comfortable with accepting this PR as is.

@crmne crmne linked an issue Jul 18, 2025 that may be closed by this pull request
@crmne
Owner

crmne commented Jul 18, 2025

Hi @oxaroky02, just add your models to the correct lists in spec_helper.rb and run rspec. It should create the correct VCR cassettes.
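For readers unfamiliar with the pattern: the spec suite is driven by lists of provider/model pairs, and VCR records each provider's HTTP traffic into a cassette on the first run. A hypothetical sketch of that kind of list (the constant name and entries are illustrative, not RubyLLM's actual spec_helper.rb contents):

```ruby
# Hypothetical sketch: each entry drives a spec run that records a
# VCR cassette the first time it executes against the live API.
CHAT_MODELS = [
  { provider: :openai,       model: 'gpt-4.1-nano' }, # illustrative
  { provider: :azure_openai, model: 'gpt-4o' }        # new entry for this PR
].freeze

# Specs can then iterate the list, e.g.:
CHAT_MODELS.each do |entry|
  puts "would run chat specs for #{entry[:provider]}/#{entry[:model]}"
end
```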

@oxaroky02 oxaroky02 mentioned this pull request Jul 18, 2025
@oxaroky02
Author

just add your models to the correct lists in spec_helper.rb and run rspec. It should create the correct VCR cassettes.

@crmne I've configured spec_helper.rb, but it looks like I need an active Postgres DB and some config to run the tests. I installed Postgres locally, ran bundle add pg to make that available, and now I'm running into some missing config in the DB. Sorry about this; I typically don't use Postgres locally. Is there a way for me to run the tests without needing Postgres?

@oxaroky02
Author

Hmm, looks like I need to figure out the Postgres config for the commit hooks to work. OK, no worries, I'll work on that.

@t2

t2 commented Jul 22, 2025

@oxaroky02 what errors are you getting with PG? I may be able to help. This PR is necessary for us to utilize RubyLLM without patching.

@t2

t2 commented Jul 22, 2025

@oxaroky02 also, could we add a config option for always assuming the model exists? With Azure we have many model names and would love to just use the standard RubyLLM.chat syntax without passing the option each time.
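The requested behaviour might look something like this (a sketch of the idea only; this option does not exist in RubyLLM yet, and all names here are illustrative):

```ruby
# Sketch of a global assume_model_exists flag: a per-call override
# wins when given, otherwise the configured default applies.
Config = Struct.new(:assume_model_exists, keyword_init: true)

def assume_model?(config, override: nil)
  override.nil? ? !!config.assume_model_exists : override
end

global = Config.new(assume_model_exists: true)

assume_model?(global)                  # global default applies => true
assume_model?(global, override: false) # per-call override wins  => false
```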

@oxaroky02
Author

@oxaroky02 what errors are you getting with PG? I may be able to help. This PR is necessary for us to utilize RubyLLM without patching.

I tried running Postgres after a brew install, and it looks like I get an error about a missing profile. I "assumed" the defaults would be sufficient, but I think I need to be less hasty. 😀 I'm going to set up a container instead and see if I can get it to work.

@t2

t2 commented Jul 22, 2025

@oxaroky02 do you have a Mac? If so, simplest way to get PG running is installing https://postgresapp.com/. It'll give you what you need and then you can remove if you don't need it anymore.

@oxaroky02
Author

oxaroky02 commented Jul 22, 2025

@t2, I have postgres running in a container, exposed on port 5431, which the tests seem to require. I'm using a dead-simple default docker compose file with the following defaults for postgres:

      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: testdb
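A complete compose file with those defaults might look like this (a sketch; the 5431→5432 port mapping and image tag are assumptions based on the setup described here):

```yaml
# Minimal docker-compose.yml sketch matching the defaults above.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: testdb
    ports:
      - "5431:5432"  # expose container's 5432 on host port 5431
```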

I'm getting the following exception:

ActiveRecord::ConnectionNotEstablished:
  connection to server at "127.0.0.1", port 5431 failed: fe_sendauth: no password supplied

I see the spec helper loads a .env file, but there's none in the repo. So I tried adding one, and then added an ActiveRecord::Base.establish_connection... call in the spec helper. At this point I'm just making stuff up because I don't understand what I'm missing.

Questions:

  1. Is there some local env / vars I can set to configure the password before running the tests?
  2. Is there a default database I need to be present, or will the tests create the DB / tables they need?

I appreciate your patience and support with this.

@t2

t2 commented Jul 22, 2025

ActiveRecord::ConnectionNotEstablished:
connection to server at "127.0.0.1", port 5431 failed: fe_sendauth: no password supplied

I'm not extremely familiar with Docker, so I don't think I can help there. If you just install Postgres.app on your main computer, I expect it should just work.

@oxaroky02
Author

@t2, no worries, I'll keep looking. I'm limited in what I can install on my work laptop. The DB is up and running and I'm able to connect to it, etc. So now I'm trying to figure out where exactly the password can be specified for the tests to run.

Were you able to try the new provider, though? Did it work for you? I'll look into your other question soon; I'm still figuring out the innards of ruby_llm in terms of the config and how to support this as a default provider.

@oxaroky02
Author

Looks like I can set the defaults for the pg gem via a .env file, and that at least gets me past the missing-password error. Progress! 😀
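For anyone following along: the pg gem is built on libpq, which reads the standard PG* environment variables, so a .env along these lines should work (the values below mirror the container defaults mentioned earlier and are assumptions; adjust to your setup):

```shell
# .env — libpq connection defaults picked up by the pg gem
PGHOST=127.0.0.1
PGPORT=5431
PGUSER=postgres
PGPASSWORD=password
PGDATABASE=testdb
```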

@t2

t2 commented Jul 22, 2025

config.azure_openai_api_base = ENV.fetch('AZURE_OPENAI_URI')
config.azure_openai_api_key = ENV.fetch('AZURE_OPENAI_API_KEY', nil)
config.azure_openai_api_version = ENV.fetch('AZURE_OPENAI_API_VER', nil)
config.default_model = ENV.fetch('AZURE_OPENAI_MODEL', 'gpt-4o')

I just checked out your branch and tested, and IT WORKED 🤘🏼! Once you get past the tests, this will be great for the folks on Azure OpenAI.

@oxaroky02
Author

oxaroky02 commented Jul 22, 2025

DELETING

(I figured out the problem. There was something going on with the default zsh environment when I tried to pre-load the Azure OpenAI credentials as environment variables. With a clean clone of the fork, all the tests are running fine. Aaargh. Sorry for all the noise here.)

@oxaroky02 force-pushed the add_azure_openai_provider branch from 07ffcf8 to 9b98721 on July 22, 2025 15:49
@oxaroky02
Author

oxaroky02 commented Jul 22, 2025

The tests are running properly now. I have added model-listing support and updated lib/tasks/models_update.rake to include a way to configure Azure OpenAI.

@crmne, when I run the rake task it ends up making a lot of changes to the models.json file, way more than the one model for the Azure OpenAI provider. This in turn leads to failing tests, because there's stuff in models.json without matching VCR cassettes anymore.

One thing I can do (which I tried for fun) is take the new entry and stuff it into the original models.json file. This is in effect what I expected the task to do, but since I don't have API keys for the other providers, the list from Parsera gets merged in and produces a list with a lot of changes. Doing it this way allows me to create the test cassette, etc.

BUT this kinda goes against the constraint that I shouldn't modify the file manually.

How would you like me to proceed?

Labels
new provider New provider integration
Development

Successfully merging this pull request may close these issues.

Azure OpenAI support
3 participants