21 add pre commit hooks #60

Merged · 13 commits · Apr 15, 2025
8 changes: 4 additions & 4 deletions .github/workflows/ci.yml
@@ -38,15 +38,15 @@ jobs:
environment-file: environment.yml
python-version: '3.10'
auto-activate-base: false

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install setuptools --upgrade
pip install ./

- name: Run tests
run: |
python -m unittest discover -s tests

build-venv:
@@ -76,4 +76,4 @@ jobs:
- name: Run tests
run: |
source venv/bin/activate
python -m unittest discover -s tests
2 changes: 1 addition & 1 deletion .gitignore
@@ -171,4 +171,4 @@ cython_debug/
.pypirc

# Ignore vscode settings
.vscode/
56 changes: 56 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,56 @@
ci:
autoupdate_commit_msg: "chore: update pre-commit hooks"
autofix_commit_msg: "style: pre-commit fixes"

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: trailing-whitespace # remove trailing whitespace
- id: end-of-file-fixer # ensure files end with a newline
- id: check-yaml # check YAML files for syntax errors
- id: check-json # check JSON files for syntax errors
- id: check-added-large-files # check for large files
args: ['--maxkb=500'] # set the max file size to 500KB
- id: check-case-conflict # check for case conflicts in filenames.
- id: check-merge-conflict # check for unresolved merge conflict markers
- id: check-symlinks # check for broken symlinks
# - id: debug-statements
- id: mixed-line-ending # check for mixed line endings, meaning that
# a file contains both CRLF and LF line endings. This can cause issues
# when working with files across different operating systems.

# - repo: https://github.com/psf/black
# rev: 25.1.0 # Use the latest stable version
# hooks:
# - id: black

# - repo: https://github.com/PyCQA/flake8
# rev: 7.1.1 # Use the latest stable version
# hooks:
# - id: flake8

# - repo: https://github.com/pre-commit/mirrors-isort
# rev: 6.0.0 # Use the latest stable version
# hooks:
# - id: isort

# - repo: https://github.com/astral-sh/ruff-pre-commit
# rev: "v0.11.5"
# hooks:
# # first, lint + autofix
# - id: ruff
# types_or: [python, pyi, jupyter]
# args: ["--fix", "--show-fixes"]
# # then, format
# - id: ruff-format

# - repo: https://github.com/pre-commit/mirrors-mypy
# rev: "v1.15.0"
# hooks:
# - id: mypy
# files: src
# args: []
# additional_dependencies:
# - pytest
47 changes: 47 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,47 @@
See the [Scientific Python Developer Guide][spc-dev-intro] for a detailed
description of best practices for developing scientific packages.

[spc-dev-intro]: https://learn.scientific-python.org/development/

# Setting up a development environment manually

You can set up a development environment by running:

```zsh
python3 -m venv venv # create a virtualenv called venv
source ./venv/bin/activate # now `python` points to the virtualenv python
pip install -v -e ".[dev]" # -v for verbose, -e for editable, [dev] for dev dependencies
```

# Post setup

You should prepare pre-commit, which will help you by checking that commits pass
required checks:

```bash
pip install pre-commit # or brew install pre-commit on macOS
pre-commit install # this will install a pre-commit hook into the git repo
```

You can also run `pre-commit run` (changed files only) or
`pre-commit run --all-files` to run the checks without installing the hook.

# Testing

This repo uses `unittest` for testing. You can run the tests locally with:

```bash
python -m unittest discover -s tests
```
There is also an automated test pipeline that runs the tests on every push to the repository (see [here](.github/workflows/ci.yml)).
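For example, a minimal test module that discovery would pick up looks like the following (the file and test names here are hypothetical, not ones from this repo; any file matching `test*.py` under `tests/` is collected by `python -m unittest discover -s tests`):

```python
# tests/test_example.py — a hypothetical placeholder test module
import unittest


class TestExample(unittest.TestCase):
    def test_addition(self):
        # a trivial assertion, just to show the TestCase shape
        self.assertEqual(1 + 1, 2)
```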


# Pre-commit

This project uses pre-commit for all style checking. Install pre-commit and run:

```bash
pre-commit run -a
```

to check all files.
16 changes: 8 additions & 8 deletions README.md
@@ -7,7 +7,7 @@
The scope of this package is to provide a framework for building **0D models** and **simulating cardiovascular flow** and **mechanics**. Conceptually, the models can be split into three types of components:
1. **Heart chambers**
2. **Valves**
3. **Vessels**

## Clone the ModularCirc GitHub repo locally

@@ -26,7 +26,7 @@ Before installation of the ModularCirc package, please setup a virtual environme

Install Conda from https://docs.conda.io/projects/conda/en/stable/user-guide/install/index.html

Run:

```
conda create --name <yourenvname>
@@ -37,15 +37,15 @@ Proceed to installing the ModularCirc package.

### Python virtual environment setup

Run `python3 -m venv venv`. This creates a virtual environment called `venv` in your base directory.

Activate the python environment: `source venv/bin/activate`

Proceed to installing the ModularCirc package.

## Installation

To install the pip package:

```bash
python -m pip install ModularCirc_LevanBokeria
@@ -79,14 +79,14 @@ TEMPLATE_TIME_SETUP_DICT = {
'name' : 'TimeTest',
'ncycles' : 40,
'tcycle' : 1.0,
'dt' : 0.001,
'export_min' : 1
}
```
Here, `ncycles` indicates the maximum number of heart cycles to run before the simulation finishes.
If the simulation reaches steady state faster than that, the simulation will end, provided the number of cycles is higher than `export_min`.
`tcycle` indicates the duration of the heart beat and `dt` represents the time step size used in the temporal discretization.
These measurements assume that time is measured in **seconds**.
If the units used are different, ensure this is done consistently in line with other parameters.
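As a quick sanity check on these settings (a sketch, not part of the package API), the number of solver steps per beat and the upper bound on total steps follow directly from `tcycle`, `dt`, and `ncycles`:

```python
TEMPLATE_TIME_SETUP_DICT = {
    'name': 'TimeTest',
    'ncycles': 40,    # upper bound on simulated heart cycles
    'tcycle': 1.0,    # beat duration, in seconds
    'dt': 0.001,      # time step size, in seconds
    'export_min': 1,  # minimum cycles before an early (converged) exit
}

# steps per cycle and the worst-case total if convergence is never reached
steps_per_cycle = round(TEMPLATE_TIME_SETUP_DICT['tcycle'] / TEMPLATE_TIME_SETUP_DICT['dt'])
max_total_steps = steps_per_cycle * TEMPLATE_TIME_SETUP_DICT['ncycles']
print(steps_per_cycle, max_total_steps)  # 1000 40000
```

Halving `dt` doubles the work per cycle, so the step size trades accuracy against runtime.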

4. Create an instance of the parameter object and use it to change the default values:
4 changes: 2 additions & 2 deletions Tutorials/Tutorial_01/README.md
@@ -6,7 +6,7 @@ Involved in a larger project involving patients with pulmonary arterial hyperten

Currently, while the monitors provide a lot of cardiac data, its usefulness is limited: the information is difficult to interpret and to use for making appropriate changes to patient management.

The project focuses on the development of a digital twin to aid with interpretation of the cardiac data from these monitors.

The sensitivity analysis aims to focus in on the parameters from the model that most affect pulmonary arterial pressure and cardiac output.

@@ -28,4 +28,4 @@ Using a reduced set of parameters, you can more efficiently fit the model to pat
4) Complete K fold cross validation.
5) Retrain model on all the data with the new reduced number of components.
6) Use the reduced PCA results as output and the original parameter set you created as input for emulation. Use [Autoemulate](https://github.com/alan-turing-institute/autoemulate) to find the best emulator for the data (the "step3" notebook).
7) Conduct a sensitivity analysis on the emulation results using SAlib.
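Steps 2–5 above hinge on keeping only the leading principal components. A minimal NumPy-only sketch of that idea (synthetic data standing in for the simulation outputs; this is not the project's actual pipeline) computes explained-variance ratios via SVD and truncates at 95% cumulative variance:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for simulation outputs: 200 samples, 10 features,
# with most variance concentrated in two latent directions plus small noise.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(200, 10))

Xc = X - X.mean(axis=0)                    # center the data before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance ratio per component

# keep the fewest components covering 95% of the variance
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
scores = Xc @ Vt[:k].T                     # reduced representation for emulation
print(k, scores.shape)
```

The `scores` array is what step 6 would feed to the emulator as its (reduced) output.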
22 changes: 11 additions & 11 deletions Tutorials/Tutorial_01/circ_utils.py
@@ -26,8 +26,8 @@ def signal_get_pulse(signal, dt, num=100):
Returns:
_type_: _description_
"""
ind = np.argmin(signal)

ncycle = len(signal)
new_signal = np.interp(np.linspace(0, ncycle, num), np.arange(ncycle), np.roll(signal, -ind))
new_dt = ncycle / (num - 1) * dt
@@ -80,8 +80,8 @@ def run_case(row, output_path, N_cycles, dt):
) # replace the .. with the correct Class and inputs if applicable

solver.setup(
suppress_output=True,
optimize_secondary_sv=False,
conv_cols=["p_ao"],
method='LSODA'
)
@@ -138,7 +138,7 @@ def simulation_loader(input_directory):

file_series.sort_values('Index', inplace=True)
file_series.set_index('Index', inplace=True)

# Define a dataframe for the values collected from the simulation...
signal_df = file_series.apply(
lambda row: list(pd.read_csv(os.path.join(input_directory, row['file']), index_col='Index').to_numpy().reshape((-1))),
@@ -147,7 +147,7 @@
signal_df.rename(columns=template_columns, inplace=True)

return signal_df


dict_parameters_condensed_range = dict()
dict_parameters_condensed_single = dict()
@@ -166,7 +166,7 @@ def condense_dict_parameters(dict_param:dict, prev=""):
dict_parameters_condensed_range[new_key] = tuple(np.array(r) * value)
else:
dict_parameters_condensed_single[new_key] = val[0]
return


######## DEFINED A FUNCTION TO PLOT THE VARIANCE
@@ -186,7 +186,7 @@ def plot_variance(pca, width=8, dpi=100):
cumulative_explained_variance = np.cumsum(explained_variance_ratio)
axs[1].semilogy(grid, cumulative_explained_variance, "o-")
axs[1].set(
xlabel="Component", title="% Cumulative Variance",
)
# Set up figure
fig.set(figwidth=8, dpi=100)
@@ -207,13 +207,13 @@ def scale_time_parameters_and_asign_to_components(df):
# 800 ms in this case

df['la.delay'] = df['la.delay'] * df['T'] / 800.

df['la.t_tr'] = df['la.t_tr'] * df['T'] / 800.
df['lv.t_tr'] = df['lv.t_tr'] * df['T'] / 800.

df['la.tau'] = df['la.tau'] * df['T'] / 800.
df['lv.tau'] = df['lv.tau'] * df['T'] / 800.

df['la.t_max'] = df['la.t_max'] * df['T'] / 800.
df['lv.t_max'] = df['lv.t_max'] * df['T'] / 800.
return
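The block above rescales each timing parameter from the 800 ms reference beat to each row's cycle length `T`. The same idea, sketched on a toy DataFrame (column names follow the diff; the values are made up for illustration):

```python
import pandas as pd

df = pd.DataFrame({
    'T': [800.0, 1000.0],      # cycle length in ms for each sample
    'la.delay': [40.0, 40.0],  # timings defined at the 800 ms reference beat
    'lv.t_max': [280.0, 280.0],
})

REF_T = 800.0  # reference cycle length the raw parameters assume
for col in ['la.delay', 'lv.t_max']:
    df[col] = df[col] * df['T'] / REF_T  # stretch timings with the cycle length

# row 0 is unchanged (T == 800); row 1 scales by 1000/800 = 1.25
print(df)
```

Scaling all timing parameters by the same factor keeps their relative phasing within the beat intact.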