## Workflow

We use a GitLab-flow-style git workflow. The default branch is `dev` and the publishing branch is `main`. The working branches are created from `dev` and must respect the following steps and actions:

1. A new branch is created from `dev`.
1. After finishing work on it, a PR to `dev` must be created.
1. The automatic Action/Workflow for PRs is executed.
1. The new branch is merged into `dev`.
1. The automatic Action/Workflow for _Push_ events into `dev` is executed.
1. When a new version of `dev` is ready to publish, a PR to `main` is created.
1. These Actions/Workflows are executed:
    1. PR.
    1. Version checker (to avoid overwriting an existing image on the Docker Hub repository).
1. `dev` is merged into `main`.
1. The automatic Action/Workflow for _Push_ events into `main` is executed to build a new Docker image for Modulector and publish it.
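
For reference, a typical working-branch cycle looks like this (the branch name is illustrative):

```bash
# Create a working branch from an up-to-date dev
git checkout dev
git pull
git checkout -b feature/my-change

# ...work and commit...

# Publish the branch, then open a PR to dev
git push -u origin feature/my-change
```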

## Production deploy

Below are the steps to perform a production deploy:

1. Make a copy of `docker-compose_dist.yml` with the name `docker-compose.yml`.
1. Set the environment variables that are empty with your data. They are listed below by category (a sketch of the resulting `environment` section appears after this list):
    - Django:
        - `DJANGO_SETTINGS_MODULE`: indicates the `settings.py` file to read. In production, we set in `docker-compose_dist.yml` the value `ModulectorBackend.settings_prod`, which contains several production properties.
        - `ALLOWED_HOSTS`: list of allowed hosts separated by commas. Default `['web', '.localhost', '127.0.0.1', '[::1]']`.
        - `ENABLE_SECURITY`: set the string `true` to enable Django's security mechanisms. In addition to this parameter, to have a secure site you must configure the HTTPS server; for more information on the latter, see the section [Enable SSL/HTTPS](#enable-sslhttps). Default `false`.
        - `CSRF_TRUSTED_ORIGINS`: in Django >= 4.x, it's mandatory to define this in production when you are using Daphne through NGINX. The value is a single host or a list of hosts separated by commas. The `http://` or `https://` prefix is mandatory. Examples of values: `http://127.0.0.1`, `http://127.0.0.1,https://127.0.0.1:8000`, etc. You can read more [here][csrf-trusted-issue].
        - `SECRET_KEY`: Django's secret key. If not specified, one is generated automatically with the [generate-secret-key application](https://github.com/MickaelBergem/django-generate-secret-key).
        - `MEDIA_ROOT`: absolute path where the uploaded files will be stored. By default `<project root>/uploads`.
        - `MEDIA_URL`: URL of the `MEDIA_ROOT` folder. By default `<url>/media/`.
    - PostgreSQL:
        - `POSTGRES_PORT`: database server listening port. By default, the Docker image uses `5432`.
        - `POSTGRES_DB`: database name to be used. By default, the Docker image uses `modulector`.
    - Health-checks and alerts:
        - `HEALTH_URL`: indicates the URL that will be requested on Docker health-checks. By default, it is `http://localhost:8000/drugs/`. The health-check makes a GET request to it; any HTTP status code greater than or equal to 400 is considered an error.
        - `HEALTH_ALERT_URL`: if you want to receive an alert when health-checks fail, you can set this variable to a webhook endpoint that will receive a POST request with a JSON body whose **content** field contains the failure message.
1. Go back to the project's root folder and run the following commands:
    - Docker Compose: `docker compose up -d`
1. Enter the Django container: `docker container exec -it web_modulector bash`
1. Run: `python3 manage.py createsuperuser`
1. Exit the container: `exit`
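
As a reference, the `environment` section of the `web` service in `docker-compose.yml` might look like the following sketch (the `CSRF_TRUSTED_ORIGINS` hostname is a placeholder; adjust every value to your deployment):

```yaml
services:
  web:
    environment:
      DJANGO_SETTINGS_MODULE: "ModulectorBackend.settings_prod"
      ALLOWED_HOSTS: "web,.localhost,127.0.0.1,[::1]"
      ENABLE_SECURITY: "true"
      CSRF_TRUSTED_ORIGINS: "https://modulector.example.org"
      POSTGRES_PORT: "5432"
      POSTGRES_DB: "modulector"
      HEALTH_URL: "http://localhost:8000/drugs/"
```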
### Start delays

Due to the database restoration on the first start, the `db_modulector` container may take a while to be up and ready. You can follow the status of the startup process in the logs by running: `docker compose logs --follow`.

Sometimes this delay makes the Django server throw database connection errors. If the server is still down and not automatically fixed once the database is finally up, you can restart the services by running: `docker compose up -d`.

## Enable SSL/HTTPS

To enable HTTPS, follow the steps below:

1. Set the `ENABLE_SECURITY` parameter to `true` as explained in the [Instructions](#instructions) section.
2. Copy the file `config/nginx/multiomics_intermediate_safe_dist.conf` and paste it into `config/nginx/conf.d/` with the name `multiomics_intermediate.conf`.
3. Get the `.crt` and `.pem` files for both the certificate and the private key, and put them in the `config/nginx/certificates` folder.
4. Edit the `multiomics_intermediate.conf` file that we pasted in point 2. Uncomment the lines where both the `.crt` and `.pem` files must be specified.
5. Edit the `docker-compose.yml` file so that the `nginx` service exposes both port 8000 and port 443. You also need to add the `certificates` folder to the `volumes` section. It should look something like the following sketch (the image tag and mount paths are illustrative):
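
    ```yaml
    services:
      nginx:
        container_name: nginx_modulector
        image: nginx:latest
        ports:
          - "8000:8000"
          - "443:443"
        volumes:
          - ./config/nginx/conf.d:/etc/nginx/conf.d
          - ./config/nginx/certificates:/etc/nginx/certificates
    ```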

Django's official documentation provides a configuration checklist that must be satisfied in the production settings file `settings_prod.py`. To verify that everything is fulfilled, you can run Django's deployment check (`manage.py check --deploy`) **once the server is up (this is because several required environment variables are set in the `docker-compose.yml`)**, for example:
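
```bash
# Run Django's deployment checklist inside the running container
# (container name as used elsewhere in this guide)
docker container exec web_modulector python3 manage.py check --deploy
```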

If the configuration in the `docker-compose.yml` file has changed, you can apply the changes without stopping the services by running `docker compose restart`.

If you want to stop all services, you can execute the command `docker compose down`.

## See container status

To check the different services' status, you can run, for example:
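
```bash
# One option: live status and resource usage for a container
docker stats <service's name>
```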
Where *\<service's name\>* could be `nginx_modulector`, `web_modulector` or `db_modulector`.
## Creating Dumps and Restoring from Dumps
### Export

In order to create a database dump, you can execute a command like the following (a sketch; adjust the user and database names to your deployment):
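
```bash
# Dump the modulector DB from the DB container and compress it
docker exec -t [name of the DB container] pg_dump --username=modulector --no-owner modulector | gzip > modulector.sql.gz
```
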
That command will create a compressed file with the database dump inside.
### Import
You can set up the Modulector DB in two ways.

### Importing an existing database dump (recommended)

1. Start up a PostgreSQL service. You can use the same service listed in the `docker-compose.dev.yml` file. Run `docker compose -f docker-compose.dev.yml up -d db_modulector` to start the DB service.
1. **Optional but recommended (you can omit these steps if this is the first time you are deploying Modulector)**: due to major changes, an import will probably throw several errors. To prevent that, perform the following steps before importing:
    1. Drop the existing database: `docker exec -i [name of the DB container] psql postgres -U postgres -c "DROP DATABASE modulector;"`
    1. Create an empty database: `docker exec -i [name of the DB container] psql postgres -U postgres -c "CREATE DATABASE modulector;"`
1. Download a `.sql.gz` dump from the [Modulector releases page](https://github.com/omics-datascience/modulector/releases) or use your own export file.
1. Restore the database: `zcat modulector.sql.gz | docker exec -i [name of the DB container] psql modulector -U modulector`. This command will restore the database using a compressed dump as the source. **Keep in mind that the process could take several minutes to finish.**
    - **NOTE**: in case you are working on Windows, the command must be executed from [Git Bash][git-bash] or WSL.
### Regenerating the data manually
1. Download the files for the mirDIP database (version 5.2) and the Illumina 'Infinium MethylationEPIC 2.0' array. The files can be freely downloaded from their respective web pages.
    **For the EPIC Methylation array**:
    - Go to the [Illumina product files web page](https://support.illumina.com/downloads/infinium-methylationepic-v2-0-product-files.html) and download the ZIP file called "*Infinium MethylationEPIC v2.0 Product Files (ZIP Format)*".
    - Unzip the file.
    - Within the unzipped files you will find one called "*EPIC-8v2-0_A1.csv*". Move this file to the directory **"modulector/files/"**.
    - **NOTE:** the total size of both files is about 5 GB.
    **For the miRBase database**: this database is embedded, as it only weighs a few MB. Its data is processed in Django migrations during the execution of the `python3 manage.py migrate` command, so you don't have to take any manual steps to incorporate miRBase data into Modulector.

1. Start up a PostgreSQL service. You can use the same service listed in the *docker-compose.dev.yml* file.
1. Run `python3 manage.py migrate` to apply all the migrations (**NOTE:** this can take a long time to finish).
## Update databases
Modulector currently works with the mirDIP (version 5.2) and miRBase (version 22.1) databases for miRNA data, and with information from the Illumina 'Infinium MethylationEPIC 2.0' array for methylation sites.

If new versions are released for these databases and you want to update them, follow these steps:

- For **mirDIP** and the **Illumina EPIC array**, you must follow the same steps described in the [Regenerating the data manually](#regenerating-the-data-manually) section, replacing the named files with the most recent versions published on their sites.
- For **miRBase**, follow the instructions below:
    1. Go to the [*Download* section on the website][mirbase-download-page].
    1. Download the files named *hairpin.fa* and *mature.fa* from the latest version of the database.
    1. Replace the files inside the *modulector/files/* directory with the ones downloaded in the previous step.
    1. Start up a PostgreSQL service. You can use the same service listed in the *docker-compose.dev.yml* file.
    1. Run the command `python3 manage.py migrate` to apply all the migrations (**NOTE:** this can take a long time to finish).
**Note:** These updates will work correctly as long as they maintain the format of the data in the source files.
## Configure your API key
When we notify users about updates to the PubMed articles they are subscribed to, we interact with an NCBI API that uses an API key. By default, we left a random API key pre-configured in our settings file; you should replace it with your own.
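
A minimal sketch of what this looks like in the settings file (the setting name `NCBI_API_KEY` and the environment-variable fallback are illustrative; use the name that appears in your settings file):

```python
import os

# NCBI API key used when notifying users about PubMed updates.
# Replace the pre-configured placeholder with your own key.
NCBI_API_KEY = os.environ.get('NCBI_API_KEY', 'replace-me-with-your-own-key')
```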
## Cron job configuration

For cron jobs, we use the following [library](https://github.com/kraiz/django-crontab). In our settings file, we configure our cron jobs inside the `CRONJOBS = []` setting.
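
A minimal sketch of a `CRONJOBS` entry in the django-crontab format (the schedule and the dotted job path are illustrative):

```python
CRONJOBS = [
    # Run a (hypothetical) scheduled job every day at midnight
    ('0 0 * * *', 'modulector.cron.my_scheduled_job'),
]
```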