
Conversation

praneeth999
Copy link

Add support for Inference Providers and OpenRouter to LangExtract as additional cloud-based LLM options, using the OpenAI-compatible API format, alongside the recently added OpenAI support.
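Because both OpenRouter and Inference Providers expose OpenAI-compatible endpoints, a backend like this can largely reuse one request shape and vary only the base URL. A minimal sketch of that idea follows; the endpoint table and the `build_chat_request` helper are illustrative assumptions for this PR's approach, not LangExtract's actual API.

```python
import json

# Assumed mapping of provider name -> OpenAI-compatible base URL.
# OpenRouter documents https://openrouter.ai/api/v1 as its base URL;
# the rest of this sketch is hypothetical.
OPENAI_COMPATIBLE_ENDPOINTS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "openai": "https://api.openai.com/v1",
}


def build_chat_request(provider: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the (url, body) pair for an OpenAI-style chat completion call.

    Sending the request (with an Authorization header) is left out so the
    provider-independent part stays visible: only the base URL changes.
    """
    base = OPENAI_COMPATIBLE_ENDPOINTS[provider]
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return f"{base}/chat/completions", body


url, body = build_chat_request(
    "openrouter",
    "meta-llama/llama-3-8b-instruct",
    "Extract the person names mentioned in this sentence.",
)
```

Swapping `"openrouter"` for `"openai"` (or any other OpenAI-compatible provider added to the table) changes only the URL, which is why these providers can share one backend implementation.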

praneeth999 closed this on Aug 4, 2025
