Managed LLM

1-click-deploy

This sample application demonstrates the use of OpenAI-compatible Managed LLMs (Large Language Models) with Defang.

Note: Using Docker Model Provider? See our Managed LLM with Docker Model Provider sample.

Using the Defang OpenAI Access Gateway, setting x-defang-llm: true enables you to use Managed LLMs on the Defang Playground or on BYOC provider platforms (such as AWS Bedrock or GCP Vertex AI) through an OpenAI-compatible SDK.

This allows switching from OpenAI to the Managed LLMs on supported cloud platforms without modifying your application code.

You can configure the LLM_MODEL and LLM_URL environment variables separately for local development and production environments (see the sketch after this list).

  • LLM_MODEL is the ID of the LLM model you are using.
  • LLM_URL is the endpoint that provides authenticated access to that model.
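
As a rough illustration, a Python service in this setup might read these variables and pass them to an OpenAI-compatible client along the lines of the sketch below. The placeholder API key and the exact call shape are assumptions, not necessarily what this sample's code does.

import os

from openai import OpenAI

# LLM_URL points at the OpenAI Access Gateway (or any OpenAI-compatible endpoint);
# LLM_MODEL is the model ID to request. The placeholder API key below is an
# assumption; the gateway handles cloud authentication for you.
client = OpenAI(
    base_url=os.environ["LLM_URL"],
    api_key=os.environ.get("OPENAI_API_KEY", "unused"),
)

response = client.chat.completions.create(
    model=os.environ["LLM_MODEL"],
    messages=[{"role": "user", "content": "Hello from Defang!"}],
)
print(response.choices[0].message.content)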

Ensure you have enabled model access for the model you intend to use. To do this, you can check your AWS Bedrock model access or GCP Vertex AI model access.

To learn about available LLM models in Defang, please see our Model Mapping documentation.

For more about Managed LLMs in Defang, please see our Managed LLMs documentation.

Defang OpenAI Access Gateway

In the compose.yaml file, the llm service routes requests to the target LLM provider's API. This service is known as the Defang OpenAI Access Gateway.

The x-defang-llm property on the llm service must be set to true in order to use the OpenAI Access Gateway when deploying with Defang.
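
A minimal sketch of what that service definition could look like is shown below; the image name and any omitted settings are assumptions, and the repository's compose.yaml is the authoritative reference.

services:
  llm:
    # Image name is an assumption; check the sample's compose.yaml for the
    # actual OpenAI Access Gateway image.
    image: defangio/openai-access-gateway
    # Required for Defang to provision managed LLM access for this service.
    x-defang-llm: true

Your application's LLM_URL would then typically point at this llm service rather than directly at a cloud provider.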

Prerequisites

  1. Download Defang CLI
  2. (Optional) If you are using Defang BYOC, authenticate with your cloud provider account
  3. (Optional for local development) Docker CLI

Development

To run the application locally, you can use the following command:

docker compose -f compose.local.yaml up --build

Deployment

Note: Download the Defang CLI before deploying.

Defang Playground

Deploy your application to the Defang Playground by opening up your terminal and typing:

defang compose up

BYOC

If you want to deploy to your own cloud account, you can use Defang BYOC.
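
For example, deploying to AWS would typically look something like the command below. The --provider flag is an assumption based on the Defang CLI; run defang compose up --help or see the BYOC documentation for the exact usage and provider-specific setup.

defang compose up --provider=aws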


Title: Managed LLM

Short Description: An app using Managed LLMs with Defang's OpenAI Access Gateway.

Tags: LLM, OpenAI, Python, Bedrock, Vertex

Languages: Python
