feat: handle o1 when used with copilot provider #235


Merged: 2 commits, Apr 8, 2025

Conversation

tarruda (Contributor) commented on Dec 19, 2024

The o1-preview and o1-mini models can be used with the copilot provider when you have a Copilot Pro subscription. This change ensures the request/response are handled correctly in those cases (currently only the openai provider is handled).
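The gp.nvim source is not shown in this thread, but the general shape of such provider/model special-casing can be sketched. This is a minimal illustration in Python, not the plugin's actual Lua code; the function and field names (`adapt_payload`, `O1_MODELS`) are invented for the example. The o1-specific constraints it applies (no `system` role, no streaming) reflect OpenAI's published limitations for the early o1 models.

```python
# Illustrative sketch only: hypothetical helper showing how a chat payload
# might be adapted for o1-family models under more than one provider.

O1_MODELS = {"o1-preview", "o1-mini"}

def adapt_payload(provider, payload):
    """Apply o1-specific request tweaks for both 'openai' and 'copilot'."""
    if provider not in ("openai", "copilot"):
        return payload
    if payload.get("model") not in O1_MODELS:
        return payload

    adapted = dict(payload)

    # Early o1 models rejected the 'system' role: fold any system text
    # into a plain user message instead.
    messages = []
    for msg in adapted.get("messages", []):
        if msg["role"] == "system":
            messages.append({"role": "user", "content": msg["content"]})
        else:
            messages.append(msg)
    adapted["messages"] = messages

    # Early o1 models did not support streaming responses.
    adapted["stream"] = False
    return adapted
```

The key point the PR title captures is the first condition: before this change, only the `openai` provider branch applied these tweaks, so the same o1 model routed through `copilot` was sent an unadjusted (and thus rejected) request.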

tarruda and others added 2 commits December 19, 2024 14:33
@Robitx merged commit 814f7ba into Robitx:main on Apr 8, 2025
1 check passed
dcai pushed a commit to dcai/gp.nvim that referenced this pull request Apr 23, 2025

Co-authored-by: Tibor Schmidt <robitx@gmail.com>