Labels: alternative-llm (Support for alternative LLM providers and models)
Description
Hi all,
LangExtract now supports third-party model providers through a new plugin and registry infrastructure. You can integrate custom LLM backends (Azure OpenAI, AWS Bedrock, custom inference servers, etc.) without modifying core LangExtract code.
Please check out the example and documentation:
https://github.com/google/langextract/tree/main/examples/custom_provider_plugin
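To give a rough feel for the shape of a plugin, here is a minimal sketch loosely modeled on the linked custom_provider_plugin example: a provider class registered against a model-ID pattern that implements an inference method yielding scored outputs. The decorator signature, module paths, and the `call_my_backend` helper below are illustrative assumptions, not the authoritative API; please treat the linked example and docs as the source of truth.

```python
# Illustrative sketch only -- class names, module paths, and the registration
# decorator approximate the linked example and may differ in detail.
import langextract as lx


@lx.providers.registry.register(r'^my-backend', priority=10)  # assumed pattern-based registration
class MyBackendProvider(lx.inference.BaseLanguageModel):
    """Handles model IDs starting with 'my-backend' (Azure OpenAI, Bedrock, a local server, ...)."""

    def __init__(self, model_id: str, **kwargs):
        super().__init__()
        self.model_id = model_id

    def infer(self, batch_prompts, **kwargs):
        # Call your own backend here and yield one list of ScoredOutput
        # objects per prompt in the batch.
        for prompt in batch_prompts:
            text = call_my_backend(prompt)  # hypothetical helper you supply
            yield [lx.inference.ScoredOutput(score=1.0, output=text)]
```

Once the plugin package is installed (or the module is imported so the registration runs), a call such as `lx.extract(..., model_id="my-backend-v1")` should resolve to the custom provider through the registry, with no changes to core LangExtract.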
Feel free to provide feedback or report any issues. Thank you!
@YunfanGoForIt @mariano @zc1175 @praneeth999 @JustStas @jkitaok