
.Net: Fixed warning in release pipeline about Docker base image in examples #6340


Merged 1 commit on May 20, 2024
38 changes: 34 additions & 4 deletions dotnet/samples/Demos/QualityCheck/README.md
This sample provides a practical demonstration of how to perform quality checks on LLM results for tasks such as text summarization and translation with Semantic Kernel Filters.

Metrics used in this example:

- [BERTScore](https://github.com/Tiiiger/bert_score) - leverages the pre-trained contextual embeddings from BERT and matches words in candidate and reference sentences by cosine similarity.
- [BLEU](https://en.wikipedia.org/wiki/BLEU) (BiLingual Evaluation Understudy) - evaluates the quality of text which has been machine-translated from one natural language to another.
- [METEOR](https://en.wikipedia.org/wiki/METEOR) (Metric for Evaluation of Translation with Explicit ORdering) - evaluates the similarity between the generated summary and the reference summary, taking into account grammar and semantics.
In this example, SK Filters call a dedicated [server](./python-server/) which is r
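To make the metrics above concrete, here is a simplified, self-contained sketch of sentence-level BLEU (uniform n-gram weights, no smoothing). It is illustrative only — the sample's server relies on established implementations, and production code should use a library such as `nltk` or `sacrebleu` rather than this sketch.

```python
# Simplified sentence-level BLEU: clipped n-gram precision (n = 1..4)
# combined by a geometric mean, with a brevity penalty.
import math
from collections import Counter


def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate: str, reference: str, max_n: int = 4) -> float:
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped matches: a candidate n-gram only counts as many times
        # as it also appears in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if overlap == 0:
            return 0.0  # any zero precision collapses the geometric mean
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty: discourage candidates shorter than the reference.
    if len(cand) > len(ref):
        bp = 1.0
    else:
        bp = math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)
```

An identical candidate and reference score 1.0; a candidate sharing no words with the reference scores 0.0.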

## Prerequisites

1. [Python 3.12](https://www.python.org/downloads/)
2. Get [Hugging Face API token](https://huggingface.co/docs/api-inference/en/quicktour#get-your-api-token).
3. Accept conditions to access [Unbabel/wmt22-cometkiwi-da](https://huggingface.co/Unbabel/wmt22-cometkiwi-da) model on Hugging Face portal.

It's possible to run the Python server for task evaluation directly or with Docker.
### Run server

1. Open the Python server directory:

```bash
cd python-server
```

2. Create and activate a virtual environment:

```bash
python -m venv venv
source venv/Scripts/activate # activate on Windows (Git Bash)
source venv/bin/activate # activate on Unix/macOS
```

3. Set up the Hugging Face API key:

```bash
pip install "huggingface_hub[cli]"
huggingface-cli login --token <your_token>
```

4. Install dependencies:

```bash
pip install -r requirements.txt
```

5. Run server:

```bash
cd app
uvicorn main:app --port 8080 --reload
```
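Once the server is up, it can be called over HTTP. The sketch below builds such a request with only the standard library; the endpoint path (`/bert-score`) and JSON payload shape are illustrative assumptions, not the server's documented API — check `http://localhost:8080/docs` for the real endpoints.

```python
# Hypothetical client sketch for the local evaluation server.
import json
import urllib.request


def build_score_request(base_url: str, endpoint: str, payload: dict) -> urllib.request.Request:
    """Construct a JSON POST request for one evaluation call."""
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/{endpoint.lstrip('/')}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_score_request(
        "http://localhost:8080",
        "/bert-score",  # assumed endpoint name
        {"sources": ["original text"], "summaries": ["generated summary"]},
    )
    # with urllib.request.urlopen(req) as resp:  # requires the server to be running
    #     print(json.load(resp))
```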
### Run server with Docker

1. Open the Python server directory:

```bash
cd python-server
```

2. Create the following `Dockerfile`:

```dockerfile
# syntax=docker/dockerfile:1.2
FROM python:3.12

WORKDIR /code

COPY ./requirements.txt /code/requirements.txt

RUN pip install "huggingface_hub[cli]"
RUN --mount=type=secret,id=hf_token \
huggingface-cli login --token $(cat /run/secrets/hf_token)

RUN pip install cmake
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

COPY ./app /code/app

CMD ["fastapi", "run", "app/main.py", "--port", "80"]
```

3. Create a `.env/hf_token.txt` file and put your Hugging Face API token in it.

4. Build the image and run the container:
```bash
docker-compose up --build
```

5. Open `http://localhost:8080/docs` and check the available endpoints.
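The `docker-compose up --build` command above assumes a compose file that wires the `hf_token` build secret and the port mapping together. A hypothetical sketch (the service name and layout are assumptions; the real compose file ships with the sample, and build secrets require BuildKit / a recent Compose version):

```yaml
# Hypothetical docker-compose.yml sketch
services:
  quality-check-server:
    build:
      context: .
      secrets:
        - hf_token        # exposed to the Dockerfile's --mount=type=secret
    ports:
      - "8080:80"         # host 8080 -> container port 80 (the CMD above)
secrets:
  hf_token:
    file: .env/hf_token.txt
```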

## Testing

17 changes: 0 additions & 17 deletions dotnet/samples/Demos/QualityCheck/python-server/Dockerfile

This file was deleted.
