
Support Ollama APIs for model management #1532


Closed

Conversation

ThomasVitale (Contributor):

  • Extend the OllamaApi to support listing, copying, deleting, and pulling models programmatically.
  • Improve setup for integration testing with Ollama and Testcontainers.

Enables gh-526
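
As a rough illustration of what this enables, the snippet below sketches the new model-management calls from a client's point of view. It is a sketch only: the constructor, the listModels/pullModel/copyModel/deleteModel method names, the request records, and the model names are assumed from the bullet points above rather than quoted from the merged code.

```java
import org.springframework.ai.ollama.api.OllamaApi;

// Sketch only: method names, request records and model names are assumptions
// based on the PR description (list, copy, delete, pull), not the merged API.
class OllamaModelManagementSketch {

    public static void main(String[] args) {
        var ollamaApi = new OllamaApi("http://localhost:11434");

        // List the models currently available on the local Ollama server.
        System.out.println(ollamaApi.listModels());

        // Pull a small model so the integration tests can re-use it.
        ollamaApi.pullModel(new OllamaApi.PullModelRequest("llama3.2:1b"));

        // Copy it under a different name, then delete the copy.
        ollamaApi.copyModel(new OllamaApi.CopyModelRequest("llama3.2:1b", "it-test-model"));
        ollamaApi.deleteModel(new OllamaApi.DeleteModelRequest("it-test-model"));
    }

}
```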

@ThomasVitale (Contributor, Author):

Once this PR is merged, we should be able to remove the disabling logic from the Ollama integration tests, since they now all share the same Ollama container instance and re-use the same 3 (small) models across all tests (the only exception is the test in the Spring Boot Testcontainers module). This means the execution will be relatively fast, even on the GitHub Actions runners.
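
For readers unfamiliar with the setup, the class below is a minimal sketch of the shared-container idea using the Testcontainers OllamaContainer module; the class name, image tag, and helper methods are illustrative, not the actual test base class in this PR.

```java
import java.io.IOException;

import org.testcontainers.ollama.OllamaContainer;

// Illustrative sketch of one Ollama container shared by all integration tests.
// Class name, image tag and helpers are assumptions, not the PR's actual code.
abstract class SharedOllamaIT {

    // Started once for the whole test run instead of once per test class.
    private static final OllamaContainer ollamaContainer = new OllamaContainer("ollama/ollama:0.3.9");

    static {
        ollamaContainer.start();
    }

    protected static String baseUrl() {
        // Full endpoint (scheme, host and mapped port) of the running container.
        return ollamaContainer.getEndpoint();
    }

    protected static void pullModel(String model) throws IOException, InterruptedException {
        // Pull the model inside the container so every test class can re-use it.
        ollamaContainer.execInContainer("ollama", "pull", model);
    }

}
```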

Commit message:

* Extend the OllamaApi to support listing, copying, deleting, and pulling models programmatically.
* Improve setup for integration testing with Ollama and Testcontainers.

Enables gh-526

Signed-off-by: Thomas Vitale <ThomasVitale@users.noreply.github.com>
@markpollack self-assigned this Oct 14, 2024
@markpollack added this to the 1.0.0-M4 milestone Oct 14, 2024
Inline review comment on the message-history prompt change in the integration tests:

    var promptWithMessageHistory = new Prompt(List.of(new UserMessage("Dummy"), response.getResult().getOutput(),
            new UserMessage("Repeat the last assistant message.")));
    var promptWithMessageHistory = new Prompt(List.of(new UserMessage("Hello"), response.getResult().getOutput(),
            new UserMessage("Tell me just the names of those pirates.")));

Member:

thanks for this, clever.

@eddumelendez (Contributor) left a comment:

Use ollamaContainer.getEndpoint()

    public static OllamaApi buildOllamaApiWithModel(String model) {
        var baseUrl = "http://localhost:11434";
        if (useTestcontainers) {
            baseUrl = "http://" + ollamaContainer.getHost() + ":" + ollamaContainer.getMappedPort(11434);

Suggested change:

    -       baseUrl = "http://" + ollamaContainer.getHost() + ":" + ollamaContainer.getMappedPort(11434);
    +       baseUrl = ollamaContainer.getEndpoint();

    public static String buildConnectionWithModel(String model) throws IOException, InterruptedException {
        var baseUrl = "http://localhost:11434";
        if (useTestcontainers) {
            baseUrl = "http://" + ollamaContainer.getHost() + ":" + ollamaContainer.getMappedPort(11434);

Suggested change:

    -       baseUrl = "http://" + ollamaContainer.getHost() + ":" + ollamaContainer.getMappedPort(11434);
    +       baseUrl = ollamaContainer.getEndpoint();
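
Applied to the helpers above, the suggestion makes the Testcontainers branch read roughly like this; the trailing model pull and the OllamaApi constructor are assumed here, since the diff context above cuts off before them.

```java
public static OllamaApi buildOllamaApiWithModel(String model) {
    var baseUrl = "http://localhost:11434";
    if (useTestcontainers) {
        // getEndpoint() already returns scheme, host and mapped port
        // (for example http://localhost:32768), so no manual concatenation is needed.
        baseUrl = ollamaContainer.getEndpoint();
    }
    // ... pull the requested model here, as in the original helper (elided in the diff) ...
    return new OllamaApi(baseUrl); // assumed: the diff context ends before the return statement
}
```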

@ThomasVitale (Contributor, Author):

Thanks, I had missed that! I opened a new PR with that improvement: #1538

@markpollack (Member):

A lot of great cleanup, took 4 min to run on my machine. Thanks.

@ThomasVitale (Contributor, Author) commented Oct 15, 2024:

@markpollack thank you! I opened a follow-up PR with the improvement suggested by @eddumelendez, and because I had left the test switch on, that PR also disables the integration tests again: #1538. Sorry about that!
