Merged
3 changes: 0 additions & 3 deletions .flake8

This file was deleted.

35 changes: 31 additions & 4 deletions .github/workflows/test-python.yaml
@@ -5,25 +5,52 @@ on:
push:
branches: [main]

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
Comment on lines +8 to +10

Member: IIRC, we discussed before that this part is not needed given that we don't have limitations on concurrent runs due to GitHub Enterprise.

Contributor: A use case is re-pushing the branch for the PR while the workflow is running. Even with no limitations, is it useful to keep the existing run going?

Member (@andreyvelich, Aug 4, 2025): Might be useful to compare results, but I am fine to cancel them for now.
@kramaranya Can you please apply the change to all GitHub Actions tests in that case?

Contributor Author: Sure, updated in 4853da4


jobs:
test:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ['3.9', '3.11']
dependency-group:
- {name: dev, make-target: test-python-dev}
- {name: prod, make-target: test-python}

name: Test ${{ matrix.dependency-group.name }} (Python ${{ matrix.python-version }})

steps:
- uses: actions/checkout@v4

- name: Set up Python
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: '3.9'
python-version: ${{ matrix.python-version }}

- name: Verify
run: make verify

- name: Run Tests
run: |
make test-python report=xml
make ${{ matrix.dependency-group.make-target }} report=xml

- name: Upload coverage to Coveralls
uses: coverallsapp/github-action@v2
with:
github-token: ${{ secrets.GITHUB_TOKEN }} # Pass the GITHUB_TOKEN
github-token: ${{ secrets.GITHUB_TOKEN }}
flag-name: ${{ matrix.dependency-group.name }}-py${{ matrix.python-version }}
parallel: true

coverage:
Member: Don't we run coverage as part of the previous job, in the "Upload coverage to Coveralls" step?

Member: As I can see, it shows the report on the PRs: #56 (comment)

Contributor Author: Yeah, I added the parallel + "coverage-finished" step when the matrix had four legs (dev/prod × two Python versions), so Coveralls could wait for every flag upload before closing the build and merging the reports: https://docs.coveralls.io/parallel-builds

But I've removed it in 189139d since we're now testing against the master modules only.

if: ${{ always() }}
needs: test
runs-on: ubuntu-latest
steps:
- name: Coveralls Finished
uses: coverallsapp/github-action@v2
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
parallel-finished: true
8 changes: 8 additions & 0 deletions CONTRIBUTING.md
@@ -51,6 +51,14 @@ To run the unit tests (if present), execute:
pytest
```

#### Development Testing

For development, and before submitting a PR, run:
```sh
make test-python-dev
```

This runs the tests against the latest master-branch dependencies to catch integration issues early.

### Code Coverage
To run tests and measure coverage:
```sh
13 changes: 12 additions & 1 deletion Makefile
@@ -67,9 +67,20 @@ uv-venv:
fi

# make test-python will produce html coverage by default. Run with `make test-python report=xml` to produce an xml report.
.PHONY: test-python-dev
test-python-dev: uv-venv
@uv pip install --project $(PY_DIR) --group dev "$(PY_DIR)[test]"
@uv run coverage run --source=kubeflow.trainer.api.trainer_client,kubeflow.trainer.utils.utils -m pytest ./python/kubeflow/trainer/api/trainer_client_test.py
@uv run coverage report -m kubeflow/trainer/api/trainer_client.py kubeflow/trainer/utils/utils.py
ifeq ($(report),xml)
@uv run coverage xml
else
@uv run coverage html
endif
Member: Can we run tests over the master modules for now?
Since we can't introduce any breaking changes to the Kubernetes API, I don't see a need to run tests against released modules.

Contributor Author:
> Since we can't introduce any breaking changes to the Kubernetes API

Hmm, why can't we?

Member: According to Kubernetes API best practices, if we introduce any breaking changes to an existing API version (v1alpha1), we have to create a newer API version, for instance v1alpha2: https://kubernetes.io/docs/reference/using-api/deprecation-policy/

cc @tenzen-y @astefanutti

In the future, we should discuss how to design our SDK to make it compatible with supported API versions, but we are not there yet.
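For illustration, a minimal sketch of the deprecation policy described above. The group, kind, and field names here are hypothetical, not the actual Trainer API:

```yaml
# A breaking change (renaming spec.replicas to spec.workerReplicas) cannot be
# made in-place in v1alpha1; it ships as a new version, v1alpha2, and both
# versions are served during the deprecation window.
apiVersion: trainer.example.io/v1alpha1   # existing version, unchanged
kind: TrainJob
spec:
  replicas: 2
---
apiVersion: trainer.example.io/v1alpha2   # new version carrying the breaking change
kind: TrainJob
spec:
  workerReplicas: 2
```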

Contributor Author: Thanks a lot for the explanation! It makes sense now to test only against the master branch.
Updated in 4d8967f


.PHONY: test-python
test-python: uv-venv
@uv pip install "./python[test]"
@uv pip install --project $(PY_DIR) "$(PY_DIR)[test]"
@uv run coverage run --source=kubeflow.trainer.api.trainer_client,kubeflow.trainer.utils.utils -m pytest ./python/kubeflow/trainer/api/trainer_client_test.py
@uv run coverage report -m kubeflow/trainer/api/trainer_client.py kubeflow/trainer/utils/utils.py
ifeq ($(report),xml)
2 changes: 1 addition & 1 deletion python/kubeflow/trainer/api/__init__.py
@@ -1,4 +1,4 @@
# flake8: noqa
# ruff: noqa

# import apis into api package

1 change: 1 addition & 0 deletions python/pyproject.toml
Member: Let's also migrate to the official package as part of this PR: https://pypi.org/project/kubeflow-trainer-api/

@@ -55,6 +55,7 @@ ignore = ["E203"]
[dependency-groups]
dev = [
"ruff>=0.12.2",
"kubeflow_trainer_api@git+https://github.com/kubeflow/trainer.git@master#subdirectory=api/python_api",
Member: Is there any use case where users would like to install the Kubeflow SDK with the latest module package?
IIUC, dependency-groups are never published to PyPI by hatch, so users can't install them.
Thoughts @astefanutti @szaher @briangallagher @kramaranya @eoinfennessy ?

Contributor Author: Yeah, I've been thinking about adding this to project.optional-dependencies as well, but since we don't update the API models that often, it makes sense to me to keep the models from master available only for testing. However, I'm open to other suggestions if you see it differently.
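The optional-dependencies alternative mentioned above would look roughly like this; the extra name `master` and the `kubeflow[master]` install spelling are hypothetical:

```toml
# Sketch: exposing the master-branch API models as an extra instead of the
# (unpublished) dev dependency group.
[project.optional-dependencies]
master = [
    "kubeflow_trainer_api@git+https://github.com/kubeflow/trainer.git@master#subdirectory=api/python_api",
]
# Users could then run:  pip install "kubeflow[master]"
```

One caveat with this design: PyPI rejects distributions whose dependencies use direct URL references, so an extra like this would only work for installs from source, not from a published wheel.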

Member: From my point of view, since we allow installing the control plane with the latest changes, users might want to experiment with it during development: https://www.kubeflow.org/docs/components/trainer/operator-guides/installation/#installing-the-kubeflow-trainer-controller-manager

This gives them an opportunity to quickly adopt newer API changes into their systems.

Contributor Author: Makes sense, let me update it

Contributor Author: Updated in 4d8967f

]

[tool.uv]
6 changes: 5 additions & 1 deletion python/uv.lock

Some generated files are not rendered by default.