I deployed the mistral-ocr-2503 model as a serverless endpoint, since: "Dedicated serverless endpoint: This model will be deployed as a dedicated serverless endpoint—it's not yet available for unified endpoint deployment." I wonder if it is possible to use this endpoint in Prompt Flow, or is there another way to use this model's output as another model's input besides calling the models manually on my side? Thanks, Jakub.
Replies: 1 comment
Hi @Kubex212, while models like `mistral-ocr-2503` deployed via dedicated serverless endpoints aren't yet part of the unified endpoint ecosystem, you can still integrate them into Prompt Flow with a bit of orchestration. Here are two practical ways to connect the output of this serverless endpoint to downstream models:

**1. Custom Tool in Prompt Flow**
You can wrap the OCR endpoint call into a custom Python tool and include it in your Prompt Flow pipeline. Here's how:
- Call the `mistral-ocr-2503` endpoint using the `requests` library. Pr…

Prompt Flow's ability to use custom components means you can easily chain models, even across endpoint types (see the first sketch below).

**2. Chained Orchestration with Azure Functions**
An Azure Function can call the OCR endpoint first and then forward its output to the downstream model (see the second sketch below). This keeps everything modular and enables better reuse and deployment flexibility.

For official guidance and deployment details on the `mistral-ocr-2503` model, refer to the official documentation.
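
For the first option, here is a minimal sketch of what such a custom Python tool could look like. The endpoint URL, key, request payload, response shape (`pages[].markdown`), and environment variable names are illustrative assumptions, not the exact schema of the deployment; take the real values from your endpoint's details in the portal.

```python
import os

import requests
from promptflow.core import tool  # on older promptflow versions: from promptflow import tool

# Assumed configuration: copy the real endpoint URL and key from your deployment.
OCR_ENDPOINT_URL = os.environ["MISTRAL_OCR_ENDPOINT"]
OCR_ENDPOINT_KEY = os.environ["MISTRAL_OCR_KEY"]


@tool
def extract_text(document_url: str) -> str:
    """Call the mistral-ocr-2503 serverless endpoint and return the extracted text."""
    # Payload shape is an assumption; adjust it to the schema your endpoint documents.
    payload = {
        "model": "mistral-ocr-2503",
        "document": {"type": "document_url", "document_url": document_url},
    }
    headers = {
        "Authorization": f"Bearer {OCR_ENDPOINT_KEY}",
        "Content-Type": "application/json",
    }
    response = requests.post(OCR_ENDPOINT_URL, headers=headers, json=payload, timeout=120)
    response.raise_for_status()
    result = response.json()
    # Assumed response shape: a list of pages, each with a "markdown" field.
    return "\n\n".join(page.get("markdown", "") for page in result.get("pages", []))
```

The string this tool returns can then be wired into a downstream LLM node in the same flow, which gives you the OCR-to-model chaining without any manual calls on your side.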
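
For the second option, here is a rough sketch of an HTTP-triggered Azure Function (Python v2 programming model) that chains the two calls. Again, the endpoint URLs, payload and response shapes, the app setting names, and the downstream chat-completions call are assumptions for illustration rather than the exact API of either endpoint.

```python
import json
import os

import azure.functions as func
import requests

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

# Assumed app settings: configure these on the Function App.
OCR_URL = os.environ["MISTRAL_OCR_ENDPOINT"]
OCR_KEY = os.environ["MISTRAL_OCR_KEY"]
CHAT_URL = os.environ["DOWNSTREAM_CHAT_ENDPOINT"]
CHAT_KEY = os.environ["DOWNSTREAM_CHAT_KEY"]


@app.route(route="ocr-then-summarize", methods=["POST"])
def ocr_then_summarize(req: func.HttpRequest) -> func.HttpResponse:
    """Step 1: OCR the document with mistral-ocr-2503. Step 2: pass the text to another model."""
    document_url = req.get_json().get("document_url")

    # Step 1 - call the OCR serverless endpoint (payload/response shape assumed).
    ocr_resp = requests.post(
        OCR_URL,
        headers={"Authorization": f"Bearer {OCR_KEY}"},
        json={
            "model": "mistral-ocr-2503",
            "document": {"type": "document_url", "document_url": document_url},
        },
        timeout=120,
    )
    ocr_resp.raise_for_status()
    text = "\n\n".join(p.get("markdown", "") for p in ocr_resp.json().get("pages", []))

    # Step 2 - feed the OCR output to a downstream model (chat-completions shape assumed).
    chat_resp = requests.post(
        CHAT_URL,
        headers={"Authorization": f"Bearer {CHAT_KEY}"},
        json={
            "messages": [
                {"role": "system", "content": "Summarize the following document."},
                {"role": "user", "content": text},
            ]
        },
        timeout=120,
    )
    chat_resp.raise_for_status()

    return func.HttpResponse(json.dumps(chat_resp.json()), mimetype="application/json")
```

Because each step only talks to its own endpoint, you can swap the downstream model (or add more steps) without touching the OCR call, which is what keeps this approach modular.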