
Commit c9773eb

update links
1 parent 974afd8 commit c9773eb

File tree

1 file changed (+3 -3 lines changed)

README.md

Lines changed: 3 additions & 3 deletions
@@ -4,7 +4,7 @@
This is a work-in-progress project to implement some components of the University of Washington [TOPMed pipeline](https://github.com/UW-GAC/analysis_pipeline) in Workflow Description Language (WDL) in a way that closely mimics [the CWL version of the UW Pipeline](https://github.com/UW-GAC/analysis_pipeline_cwl). In other words, this is a WDL that mimics a CWL that mimics a Python pipeline. All three pipelines use the same underlying R scripts, which do most of the heavy lifting, making their results directly comparable.

## Features
-* This pipeline is very similar to the CWL version, and while the main differences between the two [are documented](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/documentation/cwl-vs-wdl.md), testing indicates they are functionally equivalent -- so much so that files generated by the CWL are used as truth files for the WDL
+* This pipeline is very similar to the CWL version, and while the main differences between the two [are documented](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/_documentation_/for%20users/cwl-vs-wdl-user.md), testing indicates they are functionally equivalent -- so much so that files generated by the CWL are used as truth files for the WDL
* As it works in a Docker container, it does not have any external dependencies other than the usual setup required for [WDL](https://software.broadinstitute.org/wdl/documentation/quickstart) and [Cromwell](http://cromwell.readthedocs.io/en/develop/)
* Contains a checker workflow for validating a set of known inputs and expected outputs

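As a rough sketch of that "usual setup" for WDL and Cromwell (the jar name and workflow path below are placeholders for illustration, not taken from this commit), a syntax check of one of these workflows with Broad's standard tooling looks something like:

```
# Sketch only: the womtool jar comes from the Cromwell releases page,
# and the workflow path is a placeholder for whichever WDL in this repo you want to check.
java -jar womtool.jar validate vcf-to-gds/vcf-to-gds.wdl
```
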
@@ -15,10 +15,10 @@ The original pipeline had arguments relating to runtime such as `ncores` and `cl
### Terra users
For Terra users, it is recommended to import via Dockstore. Importing the correct JSON file for your workflow on the workflow entry page will fill in test data and recommended runtime attributes for that test data. For example, load `vcf-to-gds-terra.json` for `vcf-to-gds.wfl`. If you are using your own data, please be sure to increase your runtime attributes appropriately.
### Local users
-Running these workflows locally is technically possible, but this is not officially supported due to how the local version of Cromwell handles local resources. Please see [this document](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/documentation/running-locally.md) for specifics on running locally.
+Running these workflows locally is technically possible, but this is not officially supported due to how the local version of Cromwell handles local resources. Please see [this document](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/_documentation_/for%20users/running-locally.md) for specifics on running locally.

## Further reading
-* [checker workflows](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/_documentation_/checker.md)
+* [checker workflows](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/_documentation_/for%20users/checker.md)
* [ld-pruning](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/ld-pruning/README.md)
* [null-model](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/null-model/README.md)
* [vcf-to-gds](https://github.com/DataBiosphere/analysis_pipeline_WDL/blob/main/vcf-to-gds/README.md)
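For orientation only: the "Local users" paragraph above defers to the linked document for specifics. Under the assumption that a workflow such as `vcf-to-gds` ships as a WDL file paired with an inputs JSON (both filenames below are placeholders), a bare-bones local Cromwell invocation follows this pattern:

```
# Sketch only: substitute the real workflow file and an inputs JSON sized for your data.
# The caveats about local resource handling in the linked running-locally document still apply.
java -jar cromwell.jar run \
  vcf-to-gds/vcf-to-gds.wdl \
  --inputs my-local-inputs.json
```

On Terra, the equivalent pairing is handled at import time by loading the matching `*-terra.json` described in the Terra users note.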
