Rework Chapter 5 intro and Python testing slides #212

Merged: 7 commits, Jan 14, 2025
4 changes: 2 additions & 2 deletions 05_testing_and_ci/examples/python_testing/operations.py
@@ -54,10 +54,10 @@ def main():
data = [5, 3, 14, 27, 4, 9]

maximum = find_max(data)
print("Maximum = {}".format(maximum))
print("Maximum of {} is {}".format(data, maximum))

mean = find_mean(data)
print("Average = {}".format(mean))
print("Average of {} is {}".format(data, mean))


if __name__ == "__main__":
@@ -2,6 +2,7 @@
Tests for mathematical operations functions.
"""
from operations import find_max, find_mean
import unittest
from unittest import TestCase
import csv

@@ -78,3 +79,7 @@ def test_regression_mean(self):

# Test
self.assertAlmostEqual(actual_mean, reference_mean[0], 2)

if __name__ == "__main__":
# Run the tests
unittest.main()
7 changes: 0 additions & 7 deletions 05_testing_and_ci/examples/python_testing/tox.ini

This file was deleted.

7 changes: 7 additions & 0 deletions 05_testing_and_ci/examples/python_testing/tox.toml
@@ -0,0 +1,7 @@
requires = ["tox>=4"]
env_list = ["testing"]

[env.testing]
description = "Run pytest"
deps = ["pytest>=8"]
commands = [["pytest"]]
36 changes: 18 additions & 18 deletions 05_testing_and_ci/intro_slides.md
@@ -29,9 +29,12 @@ slideOptions:

## Learning Goals of the Chapter

- Justify the effort of developing testing infrastructure for simulation software
- Discern the concepts of unit testing, integration testing and regression testing with the perspective of simulation software
- Work with the Python testing frameworks `pytest` and `unittest`
- Explain why developing tests is crucial.
- Explain the concepts of unit testing, integration testing and regression testing with the perspective of simulation software.
- Write tests using the Python libraries `pytest` and `unittest`.
- Write tests in C++ using `Boost.Test`.
- Explain the concepts of automation workflows in RSE.
- Write automation workflows using GitHub Actions and GitLab CI.

---

@@ -48,7 +51,7 @@ slideOptions:
- Improve software reliability and reproducibility.
- Make sure that changes (bugfixes, new features) do not affect other parts of software.
- Generally all software is better off being tested regularly. Possible exceptions are very small codes with single users.
- Ensure that a released version of a software actually works.
- Ensure that a distributed/packaged software actually works.

---

@@ -93,7 +96,7 @@ assert condition, "message"
- A *unit* is a single function in one situation.
- A situation is one amongst many possible variations of input parameters.
- User creates the expected result manually.
- Fixture is the set of inputs used to generate an actual result.
- A fixture is a set of inputs used to generate an actual result.
- The actual result is compared to the expected result, e.g. using an assertion statement.
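
A minimal sketch of these concepts (`find_max` mirrors the example code; the fixture and expected result are illustrative):

```python
def find_max(data):
    """The unit under test."""
    return max(data)

def test_find_max():
    # Fixture: the set of inputs used to generate the actual result
    data = [5, 3, 14, 27, 4, 9]
    # Expected result, created manually by the user
    expected = 27
    # Compare the actual result to the expected result with an assertion
    actual = find_max(data)
    assert actual == expected, "find_max returned a wrong maximum"

test_find_max()  # silent on success, raises AssertionError on failure
```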

---
@@ -120,9 +123,9 @@ assert condition, "message"

## Test Coverage

- Coverage is the amount of code a test touches in one run.
- Coverage is the amount of code a test runs through.
- Aim for high test coverage.
- There is a trade-off: high test coverage vs. effort in test development
- There is a trade-off: extremely high test coverage vs. effort in test development
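
As a sketch, measuring coverage with the `coverage` tool (used later in the demo) typically looks like this, assuming `coverage` and `pytest` are installed:

```shell
coverage run -m pytest   # run the test suite while recording coverage
coverage report          # print a per-file summary to the terminal
coverage html            # write a browsable report to htmlcov/
```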

---

@@ -138,20 +141,17 @@ assert condition, "message"

## Test-driven Development (TDD)

- Principle is to write a test and then write a code to fulfill the test.
- Principle: write a test first, then write the code that fulfills it.
- Advantages:
- In the end user ends up with a test alongside the code.
- Leads to a robust test along with the implementation.
- Eliminates confirmation bias of the user.
- Writing tests gives clarity on what the code is supposed to do.
- Disadvantage: known to not improve productivity.
- Facilitates continuous integration.
- Reduces need for debugging.
- Disadvantages:
- False security from tests.
- Neglect of overall design.
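
A hypothetical red-green cycle, sketched in Python (the function and its behavior are illustrative, not from the course code):

```python
# Step 1 ("red"): write the test first. Running it now would raise a
# NameError, because the function does not exist yet.
def test_celsius_to_fahrenheit():
    assert celsius_to_fahrenheit(0) == 32
    assert celsius_to_fahrenheit(100) == 212

# Step 2 ("green"): write just enough code to make the test pass.
def celsius_to_fahrenheit(celsius):
    return celsius * 9 / 5 + 32

test_celsius_to_fahrenheit()  # passes once the implementation exists
```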

---

## Checking-driven Development (CDD)

- Developer performs spot checks; sanity checks at intermediate stages
- Simulation software often has heuristics which are easy to determine.
- Keep performing same checks at different stages of development to ensure the code works.
Source: https://en.wikipedia.org/wiki/Test-driven_development

---

42 changes: 21 additions & 21 deletions 05_testing_and_ci/python_testing_demo.md
@@ -24,20 +24,20 @@ pip install -U pytest
- All tests can be run using the command-line tool called `pytest`. Just type `pytest` in the working directory and hit ENTER.
- If pytest is installed in some other way, you might need to run it like `python -m pytest`.
- One test is expected to fail. Reading the error message we understand that the failure occurs because floating-point variable comparison is not handled correctly.
- We need to tell pytest that while comparing two floating-point variables the value needs to be correct only up to a certain tolerance limit. To do this the expected mean value needs to be changed by uncommenting the line in the following part of the code:
- We need to tell pytest that while comparing two floating-point variables the value needs to be correct only up to a certain tolerance limit. To do this, the expected mean value needs to be changed by uncommenting the line in the following part of the code:

```python
# Expected result
expected_mean = 78.33
# expected_mean = pytest.approx(78.33, abs=0.01)
```

- **Comparing floating point variables** needs to be handled in functions like `find_average` and is done using `pytest.approx(value, abs)`. The `abs` value is the tolerance up to which the floating-point value will be checked, that is `78.33 +/- 0.01`.
- **Comparing floating point variables** needs to be handled in functions like `find_mean` and is done using `pytest.approx(value, abs)`. The `abs` value is the tolerance up to which the floating-point value will be checked, that is `78.33 +/- 0.01`.
- Even if one test fails, pytest runs all the tests and reports on the failing ones. The assertion failure report generated by pytest is also more detailed than the usual Python assertion report. When a test fails, the following is observed:

```bash
=============================================== FAILURES ====================================================
____________________________________________ test_find_mean _________________________________________________
========================================== FAILURES ===============================================
_______________________________________ test_find_mean ____________________________________________

def test_find_mean():
"""
@@ -73,20 +73,21 @@ tests/
```

- Putting the tests in a folder `tests/` does not affect the behavior of pytest. When pytest is run from the original directory, the tests are found and run.
- **Note**: revert to the old directory structure before proceeding to the next section.
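
As a self-contained sketch of the floating-point comparison described above (assuming `pytest` is installed; the data values are illustrative):

```python
import pytest

def find_mean(data):
    """Mirrors the demo's find_mean: the arithmetic mean of a list."""
    return sum(data) / len(data)

def test_find_mean():
    data = [55, 78, 102]  # mean is 78.333...
    # An exact comparison (== 78.33) would fail; approx adds a tolerance.
    expected_mean = pytest.approx(78.33, abs=0.01)
    assert find_mean(data) == expected_mean

test_find_mean()  # raises AssertionError if the comparison fails
```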

## unittest

- Base class `unittest.TestCase` is used to create a test suite consisting of all the tests of a software.
- Base class `unittest.TestCase` is used to create a test suite.
- Each test is now a function of a class which is derived from the class `unittest.TestCase`.
- The same tests as for `pytest` are implemented using `unittest` in the file `test_operations_unittests.py`. The tests are functions of a class named `TestOperations` which tests our mathematical operations. The class `TestOperations` is derived from `unittest.TestCase`.
- unittest can be run by:
- unittest can be run as a Python module:

```bash
python -m unittest
```

- unittest.TestCase offers functions like `assertEqual`, `assertAlmostEqual`, `assertTrue`, etc. for use instead of the usual assertion statements. These statements help the test runner to accumulate all test results and generate a test report.
- `unittest.main()` provides an option to run the tests from a command-line interface.
- unittest.TestCase offers functions like `assertEqual`, `assertAlmostEqual`, `assertTrue`, and more ([see unittest.TestCase documentation](https://docs.python.org/3/library/unittest.html#unittest.TestCase)) for use instead of the usual assertion statements. These statements help the test runner accumulate all test results and generate a test report.
- `unittest.main()` provides an option to run the tests from a command-line interface and also from a file.
- The `setUp` function is executed before each test. Similarly, a clean-up function `tearDown` runs after each test.
- The intention is to group sets of similar tests in a class derived from `unittest.TestCase` and to have multiple such classes.
- Decorators such as `@unittest.skip`, `@unittest.skipIf`, `@unittest.expectedFailure` can be used to gain flexibility over how tests are run.
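
A minimal sketch combining these features (the class and test names are illustrative; the data mirrors the demo):

```python
import unittest

class TestOperations(unittest.TestCase):
    """Groups similar tests; multiple such classes can coexist."""

    def setUp(self):
        # Executed before every test: prepare a fresh fixture
        self.data = [5, 3, 14, 27, 4, 9]

    def tearDown(self):
        # Executed after every test: clean up the fixture
        self.data = None

    def test_find_mean(self):
        mean = sum(self.data) / len(self.data)
        # Floating-point comparison up to 2 decimal places
        self.assertAlmostEqual(mean, 10.33, 2)

    @unittest.skip("demonstrates skipping a test")
    def test_not_implemented_yet(self):
        self.fail("never executed")

# Run the suite programmatically; `python -m unittest` or
# unittest.main() would work from the command line as well.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestOperations)
unittest.TextTestRunner(verbosity=2).run(suite)
```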
@@ -123,22 +124,21 @@ coverage html

## tox

- Automation for Python testing (and much more)
- Virtual environments are created for each task, and tox takes care of installing dependencies and the package itself inside of the environment.
- Order of preference for files that tox tries to read: `pyproject.toml`, `tox.ini`, `setup.cfg`
- `tox.ini` file
- Environment orchestrator to set up and execute various tools for a project.
- `tox` creates a virtual environment to run each tool in.
- `tox.toml` file:

```ini
[tox]
envlist = my_env
skipsdist = true
```toml
requires = ["tox>=4"]
env_list = ["testing"]

[testenv]
deps = pytest
commands = pytest
[env.testing]
description = "Run pytest"
deps = ["pytest>=8"]
commands = [["pytest"]]
```

- Global settings defined under section `[tox]` in the INI file.
- Start tox by running the command `tox` in the directory where the `tox.ini` exists.
- Global settings are defined at the top of the `tox.toml` file.
- Start tox by running the command `tox` in the directory where the `tox.toml` exists.
- tox takes more time the first time it is run as it creates the necessary virtual environments. The virtual environment setup can be found in the `.tox` directory.
- Observe that tox sets up a virtual environment, installs the dependency (here `pytest`) and runs `pytest`.
32 changes: 15 additions & 17 deletions 05_testing_and_ci/python_testing_exercise.md
@@ -2,27 +2,25 @@

## Starting Remarks

- [Exercise repository link](https://github.com/Simulation-Software-Engineering/testing-python-exercise-wt2223)
- Deadline for submitting this exercise is **Thursday 26th January 2023 09:00**.
- Structure all the tests in a format similar to what is shown in the [demo code](https://github.com/Simulation-Software-Engineering/Lecture-Material/blob/main/05_testing_and_ci/examples/python_testing).
- [Exercise repository link](https://github.com/Simulation-Software-Engineering/testing-python-exercise-wt2425)
- Deadline for submitting this exercise is **Wednesday 22nd January 2025 09:00**.

## Prerequisites

- An operating system / software / environment where you can install Python and some basic Python tools
- An editor or IDE to edit Python code and write text files
- The following Python tools:
- Python (version >= 3)
- [pip](https://pypi.org/project/pip/)
- [NumPy](https://numpy.org/)
- [Matplotlib](https://matplotlib.org/)
- [pytest](https://docs.pytest.org/en/6.2.x/getting-started.html#install-pytest)
- [unittest](https://docs.python.org/3/library/unittest.html#module-unittest)
- [coverage](https://coverage.readthedocs.io/en/6.2/#quick-start)
- [tox](https://tox.wiki/en/4.0.15/installation.html)
- Python (version >= 3)
- [pip](https://pypi.org/project/pip/)
- [NumPy](https://numpy.org/)
- [Matplotlib](https://matplotlib.org/)
- [pytest](https://docs.pytest.org/en/6.2.x/getting-started.html#install-pytest)
- [unittest](https://docs.python.org/3/library/unittest.html#module-unittest)
- [coverage](https://coverage.readthedocs.io/en/6.2/#quick-start)
- [tox](https://tox.wiki/en/4.23.2/installation.html)

## Step 1 - Getting Familiar With the Code

- Fork the [repository](https://github.com/Simulation-Software-Engineering/testing-python-exercise-wt2223).
- Fork the [repository](https://github.com/Simulation-Software-Engineering/testing-python-exercise-wt2425).
- The code in `diffusion2d.py` is in principle the same code used for the Python packaging exercise. The main difference is that now the code has a class `SolveDiffusion2D` which has several member functions.
- Each function name states what the function does, for example `initialize_domain()` takes the input arguments `w` (width), `h` (height), `dx` and `dy`, assigns the values to member variables of the class and also calculates the number of points in the x and y directions.
- The functions `initialize_domain` and `initialize_physical_parameters` have default values for all input parameters, hence they can be called without any parameters.
@@ -51,7 +49,7 @@
- Note that you have the object of the class `SolveDiffusion2D` and hence you can access member variables, for example `solver.nx` and `solver.ny`. This is useful to check the actual values.
- Using a similar workflow, complete the other two unit tests.
- Run the tests using `pytest`.
- It is observed that in some instances `pytest` is not able to find the tests. One reason is the way pytest is installed, which is typically either using `pip` or `apt`. Refer to the [corresponding section](https://github.com/Simulation-Software-Engineering/Lecture-Material/blob/main/05_testing_and_ci/python_testing_demo.md#pytest) in the demo for more details. If such errors occur, then try to explicitly point pytest to the relevant test file. For example:
- It is observed that in some instances `pytest` is not able to find the tests. One reason is the way pytest is installed, which is typically either using `pip` or `apt`. Refer to the [corresponding section](https://github.com/Simulation-Software-Engineering/Lecture-Material/blob/main/05_testing_and_ci/python_testing_demo.md#pytest) in the demo for more details. If such errors occur, then try to explicitly point pytest to the relevant test file in the following way:

```bash
pytest tests/unit/test_diffusion2d_functions.py
@@ -99,13 +97,13 @@ pytest tests/unit/test_diffusion2d_functions.py

- Using the coverage tool generate a HTML report of the code coverage of all the tests.
- Open the report file in a browser and print the report to a file called `coverage-report.pdf`. Add this file to the repository.
- **Note**: coverage can be used with both `pytest` and `unittest`. In this case generating the report of the unit tests using unittest is sufficient.
- **Note**: coverage can be used with both `pytest` and `unittest`. In this case, generating the report of the unit tests using unittest is sufficient.

## Step 7 - Automation Using tox

- Write a `tox.ini` file such that by running the command `tox`, both `pytest` and `unittest` are executed.
- Write a `tox.toml` file such that by running the command `tox`, both `pytest` and `unittest` are executed.
- Use the `requirements.txt` file to send all the dependencies information to tox.

## Step 8 - Submission

- Open a pull request titled `Adding tests by <GitLab username>` from your fork to the main repository.
- Open a pull request titled `[<your GitLab username>] Adding tests` (for example: `[desaiin] Adding tests`) from your fork to the main branch of the exercise repository.
22 changes: 15 additions & 7 deletions 05_testing_and_ci/python_testing_slides.md
@@ -46,7 +46,7 @@ slideOptions:

- Python framework specifically designed to run, monitor and automate unit tests.
- Many features like test automation, sharing of setup and shutdown of tests, etc.
- Using the base class `unittest.TestCase` to create a test suite.
- Use the base class `unittest.TestCase` to create a test suite.
- Command-line interface: `python -m unittest test_module1 test_module2 ...`.

---
@@ -69,19 +69,27 @@ slideOptions:

## tox

- Automation for Python testing, building and distribution
- Creates virtual environments for each process
- Depending on the command, dependencies are installed, tests are run, packaging is done, etc.
- tox command line tool reads and runs files like `pyproject.toml`, `tox.ini`, and `setup.cfg`
- More information in the [tox wiki](https://tox.wiki/en/4.0.15/index.html).
- Environment orchestrator to setup and execute various tools for a project.
- Creates virtual environments for each process.
- Processes include testing, linting, building, documentation generation, and more.
- Configuration via `tox.toml` or `tox.ini` file.

---

## tox Demo

---

## Further reading
## Other Testing Frameworks

- [nose2](https://pypi.org/project/nose2/) is an extension to `unittest` with added plugins.
- [testify](https://pypi.org/project/testify/) is based on `unittest` and `nose` with additional features.
- [robotframework](https://pypi.org/project/robotframework/) is a generic automation framework.

---

## Further Reading

- [pytest documentation](https://docs.pytest.org/en/6.2.x/)
- [unittest documentation](https://docs.python.org/3/library/unittest.html)
- [tox user guide](https://tox.wiki/en/4.23.2/user_guide.html#user-guide)