transformerlab/transformerlab-api

Transformer Lab API

API for Transformer Lab App.
Explore the docs »

API for Transformer Lab

This is the API for the Transformer Lab App, which is the main repository for this project. Please visit the Transformer Lab App repository to learn more and access the documentation.

Use the instructions below if you are installing and running the API manually on a server.

Requirements

  • An NVIDIA or AMD GPU, on Linux or on Windows with WSL2
  • or macOS with Apple Silicon
  • If you do not have a GPU, the API will still run, but it will only support inference, not training

Automatic Installation

You can use the install script to get the application running:

./install.sh

This will install mamba if it is not already installed, then use conda and uv pip to install the rest of the application's requirements. (The installer uses mamba/conda only to install Python and the CUDA drivers; all Python dependencies are installed with uv.)
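
Before running the installer, you can preview which of these tools are already on your PATH. This is only a convenience sketch; install.sh performs its own detection and installs what is missing:

```shell
# Report which installer prerequisites are already present.
# (install.sh does its own checks; this is just a quick preview.)
for tool in mamba conda uv; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: not found (will be installed as needed)"
  fi
done
```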

Manual Installation

If you prefer to install the API without using the install script, you can follow the steps on this page:

https://transformerlab.ai/docs/install/advanced-install

Run

Once conda and dependencies are installed, run the following:

./run.sh
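
Once the server is up, a quick reachability check can be done with curl. The port below is an assumption, not something this README specifies; check run.sh or the app documentation for the value your setup uses, or override it via TLAB_PORT:

```shell
# Hypothetical smoke test: probe the local API after ./run.sh.
# 8338 is an assumed default port, not confirmed by this README.
PORT="${TLAB_PORT:-8338}"
if curl -fsS "http://localhost:${PORT}/" >/dev/null 2>&1; then
  echo "API reachable on port ${PORT}"
else
  echo "API not reachable on port ${PORT}"
fi
```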

Developers:

Updating Requirements

Dependencies are managed with uv (installed separately). Add new requirements to requirements.in and, if you also want to support AMD GPUs, to requirements-rocm.in, then regenerate the corresponding requirements-uv.txt variants by running the following commands:

# GPU enabled requirements for CUDA
uv pip compile requirements.in -o requirements-uv.txt --index=https://download.pytorch.org/whl/cu128
sed -i 's/\+cu128//g' requirements-uv.txt

# GPU enabled requirements for ROCm
uv pip compile requirements-rocm.in -o requirements-rocm-uv.txt --index=https://download.pytorch.org/whl/rocm6.3
sed -i 's/\+rocm6\.3//g' requirements-rocm-uv.txt

# On a Linux or Windows (non-Mac) system without GPU support (CPU only), run:
uv pip compile requirements.in -o requirements-no-gpu-uv.txt --index=https://download.pytorch.org/whl/cpu
sed -i 's/\+cpu//g' requirements-no-gpu-uv.txt

# On a macOS system (Apple Silicon), run:
uv pip compile requirements.in -o requirements-no-gpu-uv.txt

NOTES:

  1. If the command that generates requirements-rocm-uv.txt adds the nvidia-ml-py library, remove that entry from the file.

  2. The sed commands strip the local-version suffixes (e.g. +cu128) that the PyTorch index appends to the torch packages; left in place, those suffixes break the install.
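
Both notes can be exercised on a throwaway file. The package versions below are invented purely for illustration; only the sed and grep mechanics mirror the steps above:

```shell
# Demonstrate the NOTES on a sample pinned-requirements file.
# Package versions here are made up for illustration.
printf 'torch==2.7.0+rocm6.3\nnvidia-ml-py==12.0.0\n' > /tmp/req-demo.txt
sed -i 's/\+rocm6\.3//g' /tmp/req-demo.txt                      # Note 2: strip suffix
grep -v '^nvidia-ml-py' /tmp/req-demo.txt > /tmp/req-clean.txt  # Note 1: drop the library
cat /tmp/req-clean.txt
# → torch==2.7.0
```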

Windows Notes

For Windows-specific installation steps, see:

https://transformerlab.ai/docs/install/install-on-windows
