Plugin support for custom LLM providers now available #99

@aksg87

Description

Hi all,

LangExtract now supports third-party model providers through new plugin and registry infrastructure. You can integrate custom LLM backends (Azure OpenAI, AWS Bedrock, custom inference servers, etc.) without modifying core LangExtract code: a plugin registers its provider class, and LangExtract resolves the right backend from the model identifier at runtime.

Please check out the example and documentation:
https://github.com/google/langextract/tree/main/examples/custom_provider_plugin
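For a rough idea of how a provider registry like this works, here is a minimal, self-contained sketch of the pattern. Note this is an illustrative example only, not the actual LangExtract API; the names `register_provider`, `resolve_provider`, and `MyBackendModel` are hypothetical. See the linked example plugin above for the real interfaces.

```python
# Hypothetical sketch of a provider-registry pattern (NOT the real
# LangExtract API): plugins register a model class under a model-ID
# prefix, and core code looks the class up from the model string
# without knowing about the backend in advance.
from typing import Callable, Dict

_PROVIDER_REGISTRY: Dict[str, type] = {}


def register_provider(prefix: str) -> Callable[[type], type]:
    """Class decorator: map a model-ID prefix to a provider class."""
    def decorator(cls: type) -> type:
        _PROVIDER_REGISTRY[prefix] = cls
        return cls
    return decorator


def resolve_provider(model_id: str) -> type:
    """Return the registered provider class matching model_id."""
    for prefix, cls in _PROVIDER_REGISTRY.items():
        if model_id.startswith(prefix):
            return cls
    raise ValueError(f"No provider registered for {model_id!r}")


@register_provider("my-backend/")
class MyBackendModel:
    """Toy provider plugin for a custom inference server."""

    def __init__(self, model_id: str) -> None:
        self.model_id = model_id

    def infer(self, prompt: str) -> str:
        # A real plugin would call its backend (Azure, Bedrock, a
        # custom server, ...) here; this stub just echoes the prompt.
        return f"[{self.model_id}] {prompt}"


# Core code can now pick the backend purely from the model string:
provider_cls = resolve_provider("my-backend/small")
model = provider_cls("my-backend/small")
print(model.infer("extract entities"))
```

The key design point is that registration happens in the plugin package, so adding a new backend never requires editing the core library.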

Feel free to provide feedback or report any issues. Thank you!

@YunfanGoForIt @mariano @zc1175 @praneeth999 @JustStas @jkitaok
