Commit 8050811

Merge pull request #61 from alan-turing-institute/58-separate-out-tested-models
This pull request merges release branch #58 into `main`. The release branch retains only the Naghavi and KorakianitisMixedModel as tested models; all other models have been removed. READMEs have been updated accordingly.
2 parents 6a94096 + 1425d63 commit 8050811

53 files changed (+790 / −1838 lines)

.github/workflows/ci.yml

Lines changed: 4 additions & 4 deletions
The changes in this file are whitespace-only (trailing-whitespace fixes, apparently from the new pre-commit hooks):

```diff
@@ -38,15 +38,15 @@ jobs:
           environment-file: environment.yml
           python-version: '3.10'
           auto-activate-base: false
-
+
       - name: Install dependencies
-        run: |
+        run: |
           python -m pip install --upgrade pip
           pip install setuptools --upgrade
           pip install ./
 
       - name: Run tests
-        run: |
+        run: |
           python -m unittest discover -s tests
 
   build-venv:
@@ -76,4 +76,4 @@ jobs:
       - name: Run tests
         run: |
           source venv/bin/activate
-          python -m unittest discover -s tests
+          python -m unittest discover -s tests
```
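Both CI jobs end with `python -m unittest discover -s tests`. As a reference for what that discovery step collects, here is a minimal, hypothetical test module of the kind that would live under `tests/`; the repository's actual test files are not part of this diff.

```python
# tests/test_example.py -- hypothetical placeholder, not a file from this commit.
# `python -m unittest discover -s tests` collects any test*.py module in the
# tests/ directory that defines unittest.TestCase subclasses.
import unittest


class TestExample(unittest.TestCase):
    def test_truth(self):
        # Trivial assertion so the module runs and passes on its own.
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()
```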

.github/workflows/publish.yml

Lines changed: 30 additions & 0 deletions
```diff
@@ -0,0 +1,30 @@
+name: Upload Python Package to PyPI when a Release is Created
+
+on:
+  release:
+    types: [created]
+
+jobs:
+  pypi-publish:
+    name: Publish release to PyPI
+    runs-on: ubuntu-latest
+    environment:
+      name: pypi_release
+      url: https://pypi.org/p/ModularCirc
+    permissions:
+      id-token: write
+    steps:
+      - uses: actions/checkout@v4
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.10"
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install setuptools wheel build
+      - name: Build package
+        run: |
+          python -m build
+      - name: Publish package distributions to PyPI
+        uses: pypa/gh-action-pypi-publish@release/v1
```

.gitignore

Lines changed: 1 addition & 1 deletion
Whitespace-only change on the final line, most likely the end-of-file newline fix:

```diff
@@ -171,4 +171,4 @@ cython_debug/
 .pypirc
 
 # Ignore vscode settings
-.vscode/
+.vscode/
```

.pre-commit-config.yaml

Lines changed: 56 additions & 0 deletions
```diff
@@ -0,0 +1,56 @@
+ci:
+  autoupdate_commit_msg: "chore: update pre-commit hooks"
+  autofix_commit_msg: "style: pre-commit fixes"
+
+repos:
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v5.0.0
+    hooks:
+      - id: trailing-whitespace # remove trailing whitespace
+      - id: end-of-file-fixer # ensure files end with a newline
+      - id: check-yaml # check YAML files for syntax errors
+      - id: check-json # check JSON files for syntax errors
+      - id: check-added-large-files # check for large files
+        args: ['--maxkb=500'] # set the max file size to 500KB
+      - id: check-case-conflict # check for case conflicts in filenames.
+      - id: check-merge-conflict # This hook checks for merge conflict markers in files.
+        # It ensures that there are no unresolved merge conflicts in the codebase.
+      - id: check-symlinks # check for broken symlinks
+      # - id: debug-statements
+      - id: mixed-line-ending # check for mixed line endings, meaning that
+        # a file contains both CRLF and LF line endings. This can cause issues
+        # when working with files across different operating systems.
+
+  # - repo: https://github.com/psf/black
+  #   rev: 25.1.0 # Use the latest stable version
+  #   hooks:
+  #     - id: black
+
+  # - repo: https://github.com/PyCQA/flake8
+  #   rev: 7.1.1 # Use the latest stable version
+  #   hooks:
+  #     - id: flake8
+
+  # - repo: https://github.com/pre-commit/mirrors-isort
+  #   rev: 6.0.0 # Use the latest stable version
+  #   hooks:
+  #     - id: isort
+
+  # - repo: https://github.com/astral-sh/ruff-pre-commit
+  #   rev: "v0.11.5"
+  #   hooks:
+  #     # first, lint + autofix
+  #     - id: ruff
+  #       types_or: [python, pyi, jupyter]
+  #       args: ["--fix", "--show-fixes"]
+  #     # then, format
+  #     - id: ruff-format
+
+  # - repo: https://github.com/pre-commit/mirrors-mypy
+  #   rev: "v1.15.0"
+  #   hooks:
+  #     - id: mypy
+  #       files: src
+  #       args: []
+  #       additional_dependencies:
+  #         - pytest
```

CONTRIBUTING.md

Lines changed: 47 additions & 0 deletions
````diff
@@ -0,0 +1,47 @@
+See the [Scientific Python Developer Guide][spc-dev-intro] for a detailed
+description of best practices for developing scientific packages.
+
+[spc-dev-intro]: https://learn.scientific-python.org/development/
+
+# Setting up a development environment manually
+
+You can set up a development environment by running:
+
+```zsh
+python3 -m venv venv         # create a virtualenv called venv
+source ./venv/bin/activate   # now `python` points to the virtualenv python
+pip install -v -e ".[dev]"   # -v for verbose, -e for editable, [dev] for dev dependencies
+```
+
+# Post setup
+
+You should prepare pre-commit, which will help you by checking that commits pass
+required checks:
+
+```bash
+pip install pre-commit # or brew install pre-commit on macOS
+pre-commit install # this will install a pre-commit hook into the git repo
+```
+
+You can also/alternatively run `pre-commit run` (changes only) or
+`pre-commit run --all-files` to check even without installing the hook.
+
+# Testing
+
+This repo uses `unittest` for testing. You can run the tests locally with the following command:
+
+```bash
+python -m unittest discover -s tests
+```
+There is also an automated test pipeline that runs the tests on every push to the repository (see [here](.github/workflows/ci.yml)).
+
+
+# Pre-commit
+
+This project uses pre-commit for all style checking. Install pre-commit and run:
+
+```bash
+pre-commit run -a
+```
+
+to check all files.
````

README.md

Lines changed: 29 additions & 29 deletions
````diff
@@ -7,16 +7,13 @@
 The scope of this package is to provide a framework for building **0D models** and **simulating cardiovascular flow** and **mechanics**. Conceptually, the models can be split into three types of components:
 1. **Heart chambers**
 2. **Valves**
-3. **Vessels**
+3. **Vessels**
 
-## Clone the ModularCirc GitHub repo locally
+The current version of the published package contains two models:
+1. Naghavi model.
+2. Korakianitis Mixed model.
 
-Run:
-
-```
-git clone https://github.com/alan-turing-institute/ModularCirc
-cd ModularCirc
-```
+For other models currently under development, see the `dev` branch.
 
 ## Setup Conda or python virtual environment
 
@@ -26,7 +23,7 @@ Before installation of the ModularCirc package, please setup a virtual environme
 
 Install Conda from https://docs.conda.io/projects/conda/en/stable/user-guide/install/index.html
 
-Run:
+Run:
 
 ```
 conda create --name <yourenvname>
@@ -37,21 +34,32 @@ Proceed to installing the ModularCirc package.
 
 ### Python virtual environment setup
 
-Run `python3 -m venv venv`. This creates a virtual environment called `venv` in your base directory.
+Run `python3 -m venv venv`. This creates a virtual environment called `venv` in your base directory.
 
 Activate the python environment: `source venv/bin/activate`
 
 Proceed to installing the ModularCirc package.
 
-## Installation
+## Installing ModularCirc
+
+### pip install
 
-To install the pip package:
+To install the pip package:
 
 ```bash
-python -m pip install ModularCirc_LevanBokeria
+python -m pip install ModularCirc
 ```
 
-From source:
+### Installation from source:
+
+Clone the ModularCirc GitHub repo locally.
+
+Run:
+
+```
+git clone https://github.com/alan-turing-institute/ModularCirc
+cd ModularCirc
+```
 
 After downloading the GitHub repository, from the repo directory run:
 
@@ -79,14 +87,14 @@ TEMPLATE_TIME_SETUP_DICT = {
     'name'    : 'TimeTest',
     'ncycles' : 40,
     'tcycle'  : 1.0,
-    'dt'      : 0.001,
+    'dt'      : 0.001,
     'export_min' : 1
 }
 ```
 Here, `ncycles` indicates the maximum number of heart cycles to run, before the simulation finishes.
-If the simulation reaches steady state faster than that, the simulation will end provided the number of cycles is higher than `export_min`.
-`tcycle` indicates the duration of the heart beat and `dt` represent the time step size used in the temporal discretization.
-These measurements assume that time is measured in **seconds**.
+If the simulation reaches steady state faster than that, the simulation will end provided the number of cycles is higher than `export_min`.
+`tcycle` indicates the duration of the heart beat and `dt` represent the time step size used in the temporal discretization.
+These measurements assume that time is measured in **seconds**.
 If the units used are different, ensure this is done consistently in line with other parameters.
 
 4. Create an instance of the parameter object and used it to change the default values:
@@ -122,18 +130,10 @@ p_lv = solver.model.components['lv'].P_i.values
 ## Example values pv loops for all 4 chambers:
 ![Example PV loops!](Figures/PV_loops.png)
 
-## Run tests
-
-You can run locally the tests by running the following command:
-```bash
-python -m unittest discover -s tests
-```
-there is also a autamtated test pipeline that runs the tests on every push to the repository (see [here](.github/workflows/ci.yml)).
-
 <!-- prettier-ignore-start -->
 [actions-badge]: https://github.com/alan-turing-institute/ModularCirc/workflows/CI/badge.svg
 [actions-link]: https://github.com/alan-turing-institute/ModularCirc/actions
-[pypi-link]: https://test.pypi.org/project/ModularCirc-LevanBokeria/
-[pypi-platforms]: https://img.shields.io/pypi/pyversions/ModularCirc-LevanBokeria
-[pypi-version]: https://img.shields.io/pypi/v/ModularCirc-LevanBokeria
+[pypi-link]: https://pypi.org/project/ModularCirc
+[pypi-platforms]: https://img.shields.io/pypi/pyversions/ModularCirc
+[pypi-version]: https://img.shields.io/pypi/v/ModularCirc
 <!-- prettier-ignore-end -->
````
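For context on the README hunks above (the time-setup dictionary and the `solver.model.components['lv'].P_i.values` access), a minimal end-to-end sketch of driving one of the two retained models might look like the following. The import paths, class names, and keyword arguments (`NaghaviModelParameters`, `time_setup_dict`, `parobj`, `solver.solve()`) are assumptions inferred from the README and `circ_utils.py` excerpts in this commit, not verified against the released package.

```python
# Minimal sketch of running a simulation with the Naghavi model.
# NOTE: import paths and keyword names below are assumptions based on the
# README / circ_utils.py excerpts in this commit; check the package docs.
from ModularCirc.Models.NaghaviModel import NaghaviModel, NaghaviModelParameters
from ModularCirc.Solver import Solver

TEMPLATE_TIME_SETUP_DICT = {
    'name'       : 'TimeTest',
    'ncycles'    : 40,      # max number of heart cycles before the run stops
    'tcycle'     : 1.0,     # heart-beat duration (seconds)
    'dt'         : 0.001,   # time-step size for the temporal discretisation
    'export_min' : 1,       # minimum number of cycles before early stopping
}

parobj = NaghaviModelParameters()   # default parameter set; modify before use if needed
model  = NaghaviModel(time_setup_dict=TEMPLATE_TIME_SETUP_DICT, parobj=parobj)
solver = Solver(model=model)
solver.setup(suppress_output=True, method='LSODA')   # mirrors the circ_utils.py call
solver.solve()

# Left-ventricular pressure trace, as referenced in the README hunk above.
p_lv = solver.model.components['lv'].P_i.values
```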

Tutorials/Tutorial_01/README.md

Lines changed: 2 additions & 2 deletions
Whitespace-only changes (trailing whitespace removed):

```diff
@@ -6,7 +6,7 @@ Involved in a larger project involving patients with pulmonary arterial hyperten
 
 Currently, while the monitors provide a lot of cardiac data, the usefulness of the data that they provide is limited, as it is difficult to interpret and use the information to make approprate changes to patient management.
 
-The project focuses on the development of a digital twin to aid with interpretation of the cardiac data from these monitors.
+The project focuses on the development of a digital twin to aid with interpretation of the cardiac data from these monitors.
 
 The sensitivity analysis aims to focus in on the parameters from the model that most affect pulmonary arterial pressure and cardiac output.
 
@@ -28,4 +28,4 @@ Using a reduced set of parameters, you can more efficiently fit the model to pat
 4) Complete K fold cross validation.
 5) Retrain model on all the data with the new reduced number of components.
 6) Use the reduced PCA results as output and the original parameter set you created as input for emulation - use [Autoemulate](https://github.com/alan-turing-institute/autoemulate) to find the best emulator for the data, this uses the "step3" notebook.
-7) Conduct a sensitivity analysis using the results from emulation using SAlib.
+7) Conduct a sensitivity analysis using the results from emulation using SAlib.
```
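Step 7 of the tutorial workflow above feeds emulator predictions into SAlib. A minimal, self-contained sketch of a Sobol sensitivity analysis is shown below; the parameter names, bounds, and the toy stand-in for the Autoemulate-trained emulator are purely illustrative and are not taken from the tutorial notebooks.

```python
# Illustrative Sobol sensitivity analysis with SALib (step 7 of the workflow).
# The parameter names/bounds and the toy emulator below are placeholders only.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["param_a", "param_b", "param_c"],      # hypothetical model parameters
    "bounds": [[0.5, 1.5], [0.5, 1.5], [0.5, 1.5]],  # hypothetical ranges
}

X = saltelli.sample(problem, 256)                    # Sobol sample of the input space

def toy_emulator(x):
    """Stand-in for the trained emulator's prediction on one parameter vector."""
    return x[0] ** 2 + 0.5 * x[1] + 0.1 * x[0] * x[2]

Y = np.array([toy_emulator(x) for x in X])           # emulator output per sample

Si = sobol.analyze(problem, Y)                       # first-order and total-order indices
print(Si["S1"], Si["ST"])
```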

Tutorials/Tutorial_01/circ_utils.py

Lines changed: 11 additions & 11 deletions
Whitespace-only changes (trailing whitespace removed):

```diff
@@ -26,8 +26,8 @@ def signal_get_pulse(signal, dt, num=100):
     Returns:
         _type_: _description_
     """
-    
-    ind = np.argmin(signal)
+
+    ind = np.argmin(signal)
     ncycle = len(signal)
     new_signal = np.interp(np.linspace(0, ncycle, num), np.arange(ncycle), np.roll(signal, -ind))
     new_dt = ncycle / (num - 1) * dt
@@ -80,8 +80,8 @@ def run_case(row, output_path, N_cycles, dt):
     ) # replace the .. with the correct Class and inputs if applicable
 
     solver.setup(
-        suppress_output=True,
-        optimize_secondary_sv=False,
+        suppress_output=True,
+        optimize_secondary_sv=False,
         conv_cols=["p_ao"],
         method='LSODA'
     )
@@ -138,7 +138,7 @@ def simulation_loader(input_directory):
 
     file_series.sort_values('Index', inplace=True)
     file_series.set_index('Index', inplace=True)
-    
+
     # Define a dataframe for the values collected from the simulation...
     signal_df = file_series.apply(
         lambda row: list(pd.read_csv(os.path.join(input_directory, row['file']), index_col='Index').to_numpy().reshape((-1))),
@@ -147,7 +147,7 @@ def simulation_loader(input_directory):
     signal_df.rename(columns=template_columns, inplace=True)
 
     return signal_df
-    
+
 
 dict_parameters_condensed_range = dict()
 dict_parameters_condensed_single = dict()
@@ -166,7 +166,7 @@ def condense_dict_parameters(dict_param:dict, prev=""):
             dict_parameters_condensed_range[new_key] = tuple(np.array(r) * value)
         else:
             dict_parameters_condensed_single[new_key] = val[0]
-    return
+    return
 
 
 ######## DEFINED A FUNCTION TO PLOT THE VARIANCE
@@ -186,7 +186,7 @@ def plot_variance(pca, width=8, dpi=100):
     cumulative_explained_variance = np.cumsum(explained_variance_ratio)
     axs[1].semilogy(grid, cumulative_explained_variance, "o-")
     axs[1].set(
-        xlabel="Component", title="% Cumulative Variance",
+        xlabel="Component", title="% Cumulative Variance",
     )
     # Set up figure
     fig.set(figwidth=8, dpi=100)
@@ -207,13 +207,13 @@ def scale_time_parameters_and_asign_to_components(df):
     # 800 ms in this case
 
     df['la.delay'] = df['la.delay'] * df['T'] / 800.
-    
+
     df['la.t_tr'] = df['la.t_tr'] * df['T'] / 800.
     df['lv.t_tr'] = df['lv.t_tr'] * df['T'] / 800.
-    
+
     df['la.tau'] = df['la.tau'] * df['T'] / 800.
     df['lv.tau'] = df['lv.tau'] * df['T'] / 800.
 
     df['la.t_max'] = df['la.t_max'] * df['T'] / 800.
     df['lv.t_max'] = df['lv.t_max'] * df['T'] / 800.
-    return
+    return
```
