Rework Chapter 5 intro and Python testing slides (#212)
* Rework learning goals of Chapter 5, and introduction to testing slides
* Shorten learning goal for testing part
Co-authored-by: Benjamin Uekermann <benjamin.uekermann@ipvs.uni-stuttgart.de>
* Capital case in titles of slides
* Rework Python testing demo and content slides
* Update exercise
* Minor tweaks to exercise
* Reworking Python testing demo content based on a trial
---------
Co-authored-by: Benjamin Uekermann <benjamin.uekermann@ipvs.uni-stuttgart.de>
Changed file: `05_testing_and_ci/python_testing_demo.md` (21 additions, 21 deletions)
- All tests can be run using the command-line tool called `pytest`. Just type `pytest` in the working directory and hit ENTER.
- If pytest is installed in some other way, you might need to run it like `python -m pytest`.
- One test is expected to fail. Reading the error message, we understand that the failure occurs because floating-point variable comparison is not handled correctly.
- We need to tell pytest that while comparing two floating-point variables, the value needs to be correct only up to a certain tolerance limit. To do this, the expected mean value needs to be changed by uncommenting the line in the following part of the code:

```python
# Expected result
expected_mean = 78.33
# expected_mean = pytest.approx(78.33, abs=0.01)
```
- **Comparing floating-point variables** needs to be handled in functions like `find_mean` and is done using `pytest.approx(value, abs)`. The `abs` value is the tolerance up to which the floating-point value will be checked, that is `78.33 +/- 0.01`.
- Even if one test fails, pytest runs all the tests and gives a report on the failing test. The assertion failure report generated by pytest is also more detailed than the usual Python assertion report.
- Putting the tests in a folder `tests/` does not affect the behavior of pytest. When pytest is run from the original directory, the tests are found and run.
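The tolerance check above can be sketched as a small test. The `find_mean` implementation here is an assumed stand-in for the demo's actual function, included only to make the sketch self-contained:

```python
# Illustrative sketch: find_mean is a stand-in, not the demo's actual code.
import pytest


def find_mean(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)


def test_find_mean():
    # A plain == comparison of floats would fail here;
    # pytest.approx allows a +/- 0.01 absolute tolerance.
    assert find_mean([70.0, 80.0, 85.0]) == pytest.approx(78.33, abs=0.01)
```

Running `pytest` discovers any function whose name starts with `test_` and reports each assertion failure in detail.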
- **Note**: revert to the old directory structure before proceeding to the next section.
## unittest

- Base class `unittest.TestCase` is used to create a test suite.
- Each test is now a function of a class which is derived from the class `unittest.TestCase`.
- The same tests as for `pytest` are implemented using `unittest` in the file `test_operations_unittests.py`. The tests are functions of a class named `TestOperations`, which tests our mathematical operations. The class `TestOperations` is derived from `unittest.TestCase`.
- unittest can be run as a Python module:

```bash
python -m unittest
```

- `unittest.TestCase` offers functions like `assertEqual`, `assertAlmostEqual`, `assertTrue`, and more ([see the unittest.TestCase documentation](https://docs.python.org/3/library/unittest.html#unittest.TestCase)) for use instead of the usual assertion statements. These functions help the test runner accumulate all test results and generate a test report.
- `unittest.main()` provides an option to run the tests from a command-line interface and also from a file.
- The `setUp` function is executed before each test. Similarly, a cleanup function `tearDown` exists.
- The intention is to group together sets of similar tests in a subclass of `unittest.TestCase` and have multiple such subclasses.
- Decorators such as `@unittest.skip`, `@unittest.skipIf`, and `@unittest.expectedFailure` can be used to gain more control over how tests are run.
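The points above can be sketched in a minimal suite. The `find_mean` helper and the test names here are illustrative, not the demo's actual file:

```python
# Minimal unittest sketch; names are illustrative stand-ins.
import unittest


def find_mean(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)


class TestOperations(unittest.TestCase):
    def setUp(self):
        # setUp runs before each individual test method
        self.values = [70.0, 80.0, 85.0]

    def test_find_mean(self):
        # assertAlmostEqual handles floating-point tolerance (here +/- 0.01)
        self.assertAlmostEqual(find_mean(self.values), 78.33, delta=0.01)

    @unittest.skip("example of skipping a test")
    def test_not_ready(self):
        self.assertTrue(False)

# Running `python -m unittest` in this directory discovers and runs the
# tests; unittest.main() does the same when the file is executed directly.
```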
## tox

- Environment orchestrator to set up and execute various tools for a project.
- `tox` creates virtual environments to run each tool in.
- `tox.toml` file:

```toml
requires = ["tox>=4"]
env_list = ["testing"]

[env.testing]
description = "Run pytest"
deps = ["pytest>=8"]
commands = [["pytest"]]
```

- Global settings are defined at the top of the `tox.toml` file.
- Start tox by running the command `tox` in the directory where the `tox.toml` file exists.
- tox takes more time the first time it is run as it creates the necessary virtual environments. The virtual environment setup can be found in the `.tox` directory.
- Observe that tox starts a virtual environment, installs the dependency (here `pytest`), and runs `pytest`.
- Deadline for submitting this exercise is **Thursday 26th January 2023 09:00**.
- Structure all the tests in a format similar to what is shown in the [demo code](https://github.com/Simulation-Software-Engineering/Lecture-Material/blob/main/05_testing_and_ci/examples/python_testing).
- Fork the [repository](https://github.com/Simulation-Software-Engineering/testing-python-exercise-wt2425).
- The code in `diffusion2d.py` is in principle the same code used for the Python packaging exercise. The main difference is that the code now has a class `SolveDiffusion2D` with several member functions.
- Each function name states what the function does. For example, the function `initialize_domain()` takes the input arguments `w` (width), `h` (height), `dx`, and `dy`, sets the values to member variables of the class, and also calculates the number of points in the x and y directions.
- The functions `initialize_domain` and `initialize_physical_parameters` have default values for all input parameters, hence they can be called without any arguments.
- Note that you have an object of the class `SolveDiffusion2D`, and hence you can access member variables, for example `solver.nx` and `solver.ny`. This is useful to check the actual values.
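As a sketch of this workflow, a unit test for `initialize_domain()` might look as follows. The stand-in class below only mimics the behavior described in the exercise text; the exact attribute names and formulas in the real `diffusion2d.py` may differ:

```python
# Hypothetical stand-in for the exercise's SolveDiffusion2D class;
# only the behavior described in the exercise text is mimicked here.
class SolveDiffusion2D:
    def initialize_domain(self, w=10., h=10., dx=0.1, dy=0.1):
        self.w, self.h, self.dx, self.dy = w, h, dx, dy
        # Number of grid points in x and y directions
        self.nx = int(w / dx)
        self.ny = int(h / dy)


def test_initialize_domain():
    solver = SolveDiffusion2D()
    solver.initialize_domain(w=10., h=20., dx=0.5, dy=0.5)
    # Check the actual member variables against expected values
    assert solver.nx == 20
    assert solver.ny == 40
```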
- Using a similar workflow, complete the other two unit tests.
- Run the tests using `pytest`.
- It is observed that in some instances `pytest` is not able to find the tests. One reason is the way pytest is installed, which is typically either using `pip` or `apt`. Refer to the [corresponding section](https://github.com/Simulation-Software-Engineering/Lecture-Material/blob/main/05_testing_and_ci/python_testing_demo.md#pytest) in the demo for more details. If such errors occur, try to explicitly point pytest to the relevant test file.
- Using the coverage tool, generate an HTML report of the code coverage of all the tests.
- Open the report file in a browser and print the report to a file called `coverage-report.pdf`. Add this file to the repository.
- **Note**: coverage can be used with both `pytest` and `unittest`. In this case, generating the report of the unit tests using unittest is sufficient.
## Step 7 - Automation Using tox

- Write a `tox.toml` file such that by running the command `tox`, both `pytest` and `unittest` are executed.
- Use the `requirements.txt` file to pass all the dependency information to tox.
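A sketch of how such a `tox.toml` could look, following the demo's tox 4 TOML format. The environment names are arbitrary, and the `-r requirements.txt` dependency entry is an assumption about how the requirements file is wired in:

```toml
# Hypothetical tox.toml sketch for running both test frameworks
requires = ["tox>=4"]
env_list = ["pytest", "unittest"]

[env.pytest]
description = "Run tests with pytest"
deps = ["-r requirements.txt"]
commands = [["pytest"]]

[env.unittest]
description = "Run tests with unittest"
deps = ["-r requirements.txt"]
commands = [["python", "-m", "unittest"]]
```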
## Step 8 - Submission

- Open a pull request titled `[<your GitLab username>] Adding tests` (for example: `[desaiin] Adding tests`) from your fork to the main branch of the exercise repository.