Use pytest-rerunfailures to re-run flaky Mermaid tests #5236
Motivation
#5232 (make Mermaid tests not flaky) is harder than it looks. We may have to live with some flakiness, and this PR does exactly that.
Implementation
This pull request improves the test suite's reliability by addressing flaky tests related to Mermaid rendering. The main changes add the `pytest-rerunfailures` plugin and mark the relevant tests to automatically rerun on failure, reducing false negatives in CI.

Test reliability improvements:

- Added `pytest-rerunfailures` as a dependency in both `pyproject.toml` and `tests/requirements.txt` to enable automatic rerunning of flaky tests.
- Marked the Mermaid tests in `tests/test_markdown.py` with `@pytest.mark.flaky(reruns=2)` to allow up to two reruns on failure.
- Applied the `pytest.mark.flaky(reruns=2)` marker to all tests in `tests/test_mermaid.py` for consistent rerun behavior.
- Imported `pytest` in the relevant test files to support the new flaky test markers.