how to use litellm pipelines #50
-
I have been using litellm to call some of the interfaces I have built, but now I have changed to pipelines. I can't get it to work, so I'm looking for guidance.
Replies: 3 comments 3 replies
-
For a basic setup that will support the Anthropic and Cohere APIs as an example:

```shell
docker run -d \
  --name pipelines \
  -p 9099:9099 \
  -e ANTHROPIC_API_KEY=sk-ant-12345 \
  -e COHERE_API_KEY=12345 \
  -e PIPELINES_URLS="https://raw.githubusercontent.com/open-webui/pipelines/main/examples/pipelines/providers/anthropic_manifold_pipeline.py;https://raw.githubusercontent.com/open-webui/pipelines/main/examples/pipelines/providers/cohere_manifold_pipeline.py" \
  --restart always \
  ghcr.io/open-webui/pipelines:main
```
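As the command above shows, `PIPELINES_URLS` takes a semicolon-separated list of raw pipeline URLs. A minimal sketch of how such a value splits into individual URLs (illustrative Python only; the function name `parse_pipeline_urls` is hypothetical, not part of the pipelines codebase):

```python
def parse_pipeline_urls(value: str) -> list[str]:
    """Split a PIPELINES_URLS-style value on semicolons, dropping
    surrounding whitespace and empty entries."""
    return [url.strip() for url in value.split(";") if url.strip()]

urls = parse_pipeline_urls(
    "https://raw.githubusercontent.com/open-webui/pipelines/main/examples/"
    "pipelines/providers/anthropic_manifold_pipeline.py;"
    "https://raw.githubusercontent.com/open-webui/pipelines/main/examples/"
    "pipelines/providers/cohere_manifold_pipeline.py"
)
print(len(urls))  # 2
```

Each URL is downloaded at container startup, so every pipeline you list this way becomes available without rebuilding the image.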
-
Here's a guide to help you move your `litellm` setup out of the `open-webui` container and into its own separate container. This will allow you to use `litellm` independently and connect it to `open-webui` as needed.

**Locate the `config.yaml` file**

Your `config.yaml` file should be located in the `open-webui` container. To find it, use the following command:

```shell
docker cp open-webui:/app/backend/data/litellm/config.yaml .
```

This will copy the `config.yaml` file from the `open-webui` container to your current directory.

**Run the `litellm` container**

To run `litellm` in its own container, use the following `docker run` command:

```shell
docker run -d \
  -p 4000:4000 \
  --name litellm \
  -v ./config.yaml:/app/config.yaml \
  -e LITELLM_MASTER_KEY=sk-12345 \
  --restart always \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml --port 4000
```

Replace `sk-12345` with a master key of your choosing. Once you have the container running, point `open-webui` at the new endpoint. By following these steps, you'll have moved your `litellm` setup into its own container.
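For reference, a minimal `config.yaml` for the litellm proxy might look like the fragment below. The model name and key reference are illustrative; consult the litellm proxy documentation for the full schema:

```yaml
model_list:
  - model_name: claude-3-opus            # name exposed to clients
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY  # resolved from the environment
```

Keeping keys as `os.environ/...` references means the copied file contains no secrets, so it can be mounted into the container as shown above.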
-
Thanks for the instructions. I used https://raw.githubusercontent.com/open-webui/pipelines/main/examples/pipelines/providers/anthropic_manifold_pipeline.py and my Anthropic API key to set up Claude 3. It works for text messages, but if I upload a file, I get an Error 422 Unprocessable Entity. Is this a limitation of LiteLLM or of the example Python code?