
opensourcellm-containerizing-using-bentoml

To run the project locally, first install the dependencies:

run pip install -r requirements.txt

After installation, save the model to the Bento store:

run python3 save_model.py

This will download the model and export it to the Bento store.

If you want to use a different model, update constants.py with the model details. Make sure the model referenced in bentofile.yaml and the MODEL_TAG in constants.py are the same.
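As an illustrative sketch only (the model name and tag below are hypothetical, not the project's actual values), the two settings that must agree look like this:

```python
# constants.py (hypothetical values -- substitute your own model)

# Hugging Face model to download in save_model.py (assumed name).
MODEL_ID = "facebook/opt-1.3b"

# Tag under which the model is saved in the Bento store.
# The same tag must appear under the models: section of bentofile.yaml, e.g.
#
#   models:
#     - "opt-model:latest"
MODEL_TAG = "opt-model:latest"
```

If the two diverge, bentoml build may fail to resolve the model it expects in the store.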

After finishing the above steps, you can run the service locally or via Docker.

To run locally

run bentoml serve

The server runs on port 3000; you can change this by modifying the port configured in service.py.

To create a Docker image

run bentoml build

The build output provides the command (a bentoml containerize invocation) for dockerizing the service; running that command generates the Docker image.

There is a test.py file available to test the service. With the service running, open a new terminal and

run python3 test.py
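A test client for the service might look roughly like the sketch below. This is an assumption-laden illustration, not the project's actual test.py: the endpoint path (/generate) and the payload field (prompt) depend on how service.py defines the API, and both names here are hypothetical.

```python
# Hypothetical client for the BentoML service running on localhost:3000.
# The /generate endpoint and {"prompt": ...} schema are assumptions.
import json
from urllib import request, error

SERVICE_URL = "http://localhost:3000/generate"  # assumed endpoint name


def build_request(prompt: str) -> request.Request:
    """Build a JSON POST request for the (assumed) text-generation endpoint."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    return request.Request(
        SERVICE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def query(prompt: str) -> str:
    """Send the prompt to the service; report (rather than raise) on failure."""
    try:
        with request.urlopen(build_request(prompt), timeout=30) as resp:
            return resp.read().decode("utf-8")
    except error.URLError as exc:
        return f"service unreachable: {exc}"


if __name__ == "__main__":
    print(query("Hello, world"))
```

If the service is not running, the sketch prints the connection error instead of raising, so it is safe to run either before or after bentoml serve is started.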
