
Hi @Kubex212! Models like mistral-ocr-2503 that are deployed via dedicated serverless endpoints aren't yet part of the unified endpoint ecosystem, but you can still integrate them into Prompt Flow with a bit of orchestration.

Here are two practical ways to connect the output of this serverless endpoint to downstream models:

Custom Tool in Prompt Flow
You can wrap the OCR endpoint call in a custom Python tool and include it in your Prompt Flow pipeline. Here's how:

  • Create a Python module (tool) that sends a request to the mistral-ocr-2503 endpoint using the requests library.
  • Extract the text response from the model.
  • Pass that text as an input to a downstream node (e.g., a prompt for a GPT model).
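
As a rough sketch of that custom tool (standard library only here, though the requests library works equally well): the endpoint URL, request payload shape, and response schema below are assumptions you should verify against your own deployment.

```python
import json
import urllib.request

# Hypothetical endpoint URL; replace with your deployment's actual URL.
OCR_ENDPOINT = "https://<your-deployment>.inference.ai.azure.com/v1/ocr"


def extract_ocr_text(response_json: dict) -> str:
    """Join the per-page markdown returned by the OCR model into one string.

    Assumes a {"pages": [{"markdown": ...}, ...]} response shape; check
    this against the schema your endpoint actually returns.
    """
    return "\n\n".join(
        page.get("markdown", "") for page in response_json.get("pages", [])
    )


def ocr_document(document_url: str, api_key: str) -> str:
    """Send a document URL to the mistral-ocr-2503 endpoint, return its text.

    In Prompt Flow, decorate this function as a tool (e.g. with the @tool
    decorator from the promptflow package) so it can serve as a flow node
    whose output feeds a downstream prompt.
    """
    payload = json.dumps({
        "model": "mistral-ocr-2503",
        "document": {"type": "document_url", "document_url": document_url},
    }).encode("utf-8")
    request = urllib.request.Request(
        OCR_ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return extract_ocr_text(json.load(response))
```

The extracted string can then be passed directly as the input variable of a downstream prompt node (for example, a GPT summarization prompt).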

Pr…

Answer selected by Kubex212