Pytest: wait for Mermaid to render #5232
base: main
Conversation
Pull Request Overview
This PR addresses test flakiness in Mermaid diagram rendering by adding explicit wait times to ensure asynchronous rendering completes before assertions are performed.
- Adds `screen.wait(1.0)` calls after Mermaid content updates to prevent race conditions
- Improves test stability for CI/CD pipelines by allowing sufficient rendering time
- Targets specific points where Mermaid diagrams are dynamically updated or initially loaded
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
| --- | --- |
| tests/test_mermaid.py | Added wait calls after initial page load and content update to ensure Mermaid rendering completes |
| tests/test_markdown.py | Added wait call after Mermaid content update to prevent assertion failures on slow rendering |
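Fixed delays like these trade runtime for stability: the test only passes when the asynchronous render finishes before the sleep ends. A self-contained simulation of that race (all names here are hypothetical; no NiceGUI or Selenium involved):

```python
import threading
import time


class FakeMermaid:
    """Simulates a diagram that finishes rendering asynchronously."""

    def __init__(self, render_time: float) -> None:
        self.rendered = False
        # Rendering completes on a background timer, like client-side JS.
        threading.Timer(render_time, self._finish).start()

    def _finish(self) -> None:
        self.rendered = True


# A fixed wait passes only when rendering beats the delay:
diagram = FakeMermaid(render_time=0.05)
time.sleep(0.2)          # fixed delay, in the spirit of screen.wait(0.2)
assert diagram.rendered  # passes here; a slower CI runner could exceed the delay
```

If `render_time` ever exceeds the sleep on a loaded CI machine, the assertion fails, which is exactly the flakiness being discussed.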
Thanks for looking into this flaky test, @evnchn! It also caught my attention a few days ago. But I think your new delays are irrelevant to the specific problem. If the test fails, it also fails in test_markdown.py at `assert screen.selenium.find_element(By.XPATH, '//span[p[contains(text(), "Node_A")]]').is_displayed()`, which is above the new delay this PR is adding. Before this line there is already a delay, which I recently increased to 1 second in 3b2d6a0. Unfortunately this didn't help, so it remains unclear how to fix this problem. We could further increase the duration, but this adds to the overall time it takes to run all tests. Maybe calling …
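One alternative to further increasing a fixed delay is a polling wait that returns as soon as the condition holds, so fast runs don't pay the full timeout. A minimal stdlib sketch (the helper name and signature are illustrative, not part of NiceGUI's testing API):

```python
import time


def wait_until(condition, *, timeout: float = 3.0, interval: float = 0.1) -> bool:
    """Poll `condition` until it returns True or `timeout` elapses.

    Unlike a fixed sleep, this returns immediately once the condition holds,
    so fast runs stay fast while slow runs get the full timeout budget.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return condition()  # one last check at the deadline


# Example: a "render" that completes after ~0.3 s is found well within 2 s
start = time.monotonic()
assert wait_until(lambda: time.monotonic() - start > 0.3, timeout=2.0)
```

Selenium's `WebDriverWait` implements the same idea for DOM conditions, which is what a `find_element`-based assertion could be wrapped in.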
Indeed the waits, as inserted, do not fix the issue, because the issue surfaces even with the waits exactly as I inserted them. My apologies for the 🤡.
Note that with #4722 and #4692 we no longer flash-render the raw Mermaid syntax text, since we use the Mermaid API directly. I wrote it, so I know it. Well, as with pipeline stuff, there's no better way than to try out different things and basically fire shots into the dark (tests always pass locally...). It's more "merge it if it doesn't break existing tests, then pray that it isn't flaky". @falkoschindler Two things though, for another PR: …
https://github.com/zauberzeug/nicegui/actions/runs/18405814901/job/52445352379

[screenshot of the failed pipeline run]

Huh... So Mermaid is broken when we have a failing test...
Very interesting! This confirms my suspicion that we shouldn't simply repeat tests that don't pass on the first try.
We will have to re-think if that's the case.
@evnchn Do we really need …
@falkoschindler AFAIK …
Motivation
It's infuriating to wake up to a red pipeline in my fork. Consider https://github.com/evnchn/nicegui/actions/runs/18337127046
Implementation
This pull request enhances the test coverage and reliability for Mermaid diagram rendering in both test_markdown.py and test_mermaid.py by adding explicit assertions for node content and display. The changes ensure that key nodes are present or absent as expected after user interactions.

Improvements to test assertions for Mermaid diagram rendering:

- Updated tests/test_markdown.py to assert that 'Node_A' and 'Node_C' are present or absent as appropriate after setting new content, and that these nodes are displayed in the rendered output.
- Updated tests/test_markdown.py to verify that both 'Node_A' and 'Node_B' are present and displayed.
- Updated tests/test_mermaid.py by adding assertions to confirm that 'Node_A' and 'Node_C' are present and have the correct class, and that 'Node_A' is absent after content changes.

HUMAN NOTE: By the "checks" I mean `screen.should_contain`, which has a built-in retry mechanism. Check #5232 (comment) and nicegui/nicegui/testing/screen.py, lines 195 to 210 in c5c9dfe.
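A retry mechanism of that kind is, in outline, a loop that re-checks the page a few times before giving up. A rough stdlib sketch of the pattern (not the actual NiceGUI code; the function name, parameters, and defaults are assumed for illustration):

```python
import time


def should_contain(get_page_text, needle: str, *,
                   retries: int = 10, delay: float = 0.1) -> None:
    """Retry a containment check instead of failing on the first slow render.

    `get_page_text` stands in for reading the rendered page; the real
    implementation would query the browser instead.
    """
    for _ in range(retries):
        if needle in get_page_text():
            return  # found it; succeed without waiting any longer
        time.sleep(delay)
    raise AssertionError(f'could not find "{needle}" on the page')


# Succeeds immediately when the text is already present:
should_contain(lambda: 'graph TD; Node_A --> Node_B', 'Node_A',
               retries=2, delay=0.01)
```

This is why routing the assertions through such a check, rather than a bare `find_element(...).is_displayed()`, tolerates slow rendering without a fixed worst-case sleep.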
Progress