
Commit 49c0560
Merge pull request #97 from Julia-Tempering/fix-docs: "fix documentation under the hood"
2 parents: a7738c5 + cb912b5

24 files changed, +86 -89 lines

docs/make.jl (1 addition, 1 deletion)

@@ -28,7 +28,7 @@ makedocs(;
     sitename="Pigeons.jl",
     strict=true,
     format=Documenter.HTML(;
-        prettyurls=get(ENV, "CI", "false") == "true",
+        prettyurls=true, # always on, avoids confusion when building locally. If needed, serve the "build" folder locally with LiveServer. #get(ENV, "CI", "false") == "true",
         canonical="https://Julia-Tempering.github.io/Pigeons.jl",
         edit_link="main",
         assets=String[],
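
The added comment suggests previewing a locally built site with LiveServer rather than opening the HTML files directly. A minimal sketch of that workflow (not part of the commit), assuming the documentation has already been built into docs/build:

```julia
# Serve the already-built docs over HTTP; with prettyurls=true, directory-style
# links only resolve correctly when served, not when opened from disk.
using LiveServer
serve(dir = "docs/build")   # then browse http://localhost:8000
```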

docs/src/correctness.md (1 addition, 1 deletion)

@@ -7,7 +7,7 @@ CurrentModule = Pigeons
 It is notoriously difficult to implement correct parallel/distributed algorithms.
 One strategy we use to address this is to guarantee that the code will output
 precisely the same output no matter how many threads/machines are used.
-We describe how this is done under the hood in the page [Distributed PT](distributed.html).
+We describe how this is done under the hood in the page [Distributed PT](@ref distributed).
 
 In practice, how is this useful? Let us say you developed a new target and you would like
 to make sure that it works correctly in a multi-threaded environment. To do so, add a flag to indicate to "check" one of the PT rounds as follows, and
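
The surrounding text refers to flagging one PT round to be "checked" for reproducibility in a multi-threaded run. A minimal sketch of what such a run could look like, assuming the `checked_round`, `checkpoint` and `multithreaded` options of `pigeons()`; the target is illustrative:

```julia
using Pigeons

# Flag one PT round (here, round 3) to be checked for parallelism invariance,
# as described in the page text above.
pt = pigeons(
    target        = toy_mvn_target(100),  # illustrative target
    checked_round = 3,
    checkpoint    = true,
    multithreaded = true,
)
```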

docs/src/distributed.md (3 additions, 3 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Distributed and parallel implementation of PT
+# [Distributed and parallel implementation of PT](@id distributed)
 
 ## Introduction
 
@@ -15,7 +15,7 @@ parallelized, and randomized algorithm.
 Read this page if you are interested in extending Pigeons or
 understanding how it works under the hood.
 Reading this page is not required to use Pigeons. Instead, refer to the
-[user guide](index.html).
+[user guide](@ref index).
 
 In Distributed PT, one or several computers run MCMC simulations in parallel and
 communicate with each other to improve MCMC efficiency.
@@ -79,7 +79,7 @@ Let us start with a high-level picture of the distributed PT algorithm.
 The high-level code is the function [`pigeons()`](@ref) which is identical to the single-machine algorithm.
 A first difference lay in the [`replicas`](@ref) datastructure taking on a different type. Also, as promised the
 output is identical despite a vastly different swap logic: this can be checked using the `checked_round`
-argument described in the [user guide](index.html).
+argument described in the [user guide](@ref index).
 A second difference between the execution of [`pigeons()`](@ref) in single vs many machine context is the behaviour
 of [`swap!`](@ref) which is dispatched
 based on the type of

docs/src/index.md (4 additions, 4 deletions)

@@ -2,14 +2,14 @@
 CurrentModule = Pigeons
 ```
 
-# Pigeons
+# [Pigeons](@id index)
 
 ## Summary
 
 `Pigeons` is a Julia package to approximate challenging posterior distributions, and more broadly, Lebesgue integration problems. Pigeons can be used in a multi-threaded context, and/or distributed over hundreds or thousands of MPI-communicating machines.
 
-Pigeons supports many [different ways to specify integration/expectation problems](input-overview.html) and
-provides [rich and configurable output](output-overview.html).
+Pigeons supports many [different ways to specify integration/expectation problems](@ref input-overview) and
+provides [rich and configurable output](@ref output-overview).
 
 Pigeons' core algorithm is a distributed and parallel implementation
 of the following algorithms:
@@ -22,7 +22,7 @@ These algorithms achieve state-of-the-art performance for approximation
 of challenging probability distributions.
 
 
-## Installing Pigeons
+## [Installing Pigeons](@id installing-pigeons)
 
 1. If you have not done so, install [Julia](https://julialang.org/downloads/). Julia 1.8 and higher are supported.
 2. Install `Pigeons` using
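
Step 2 of the installation list is cut off by the hunk boundary; for a registered Julia package the standard command is the following (shown for context, not taken from the diff):

```julia
using Pkg
Pkg.add("Pigeons")
```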

docs/src/input-explorers.md (3 additions, 3 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Custom explorers
+# [Custom explorers](@id input-explorers)
 
 Pigeons have several built-in [`explorer`](@ref) kernels such as
 [`AutoMALA`](@ref) and a [`SliceSampler`](@ref).
@@ -37,8 +37,8 @@ Pigeons.initialization(::MyLogPotential, ::AbstractRNG, ::Int) = [0.5, 0.5]
 
 We show how create a new explorer,
 for pedagogy, a simple [independence Metropolis algorithm](https://bookdown.org/rdpeng/advstatcomp/metropolis-hastings.html#independence-metropolis-algorithm), applied to
-our familiar [unidentifiable toy example](unidentifiable-example.html),
-based on [Julia black-box implementation](input-julia.html).
+our familiar [unidentifiable toy example](@ref unidentifiable-example),
+based on [Julia black-box implementation](@ref input-julia).
 
 ```@example explorer
 struct MyIndependenceSampler

docs/src/input-julia.md (5 additions, 6 deletions)

@@ -2,15 +2,15 @@
 CurrentModule = Pigeons
 ```
 
-# Julia code as input to pigeons
+# [Julia code as input to pigeons](@id input-julia)
 
 In typical Bayesian statistics applications, it is
 easiest to specify the model in a modelling language,
 such as Turing, but sometimes to get more flexibility or
 speed it is useful to implement the density evaluation
 manually as a "black-box" Julia function.
 
-Here we show how this is done using our familiar [unidentifiable toy example](unidentifiable-example.html)
+Here we show how this is done using our familiar [unidentifiable toy example](@ref unidentifiable-example)
 [ported to the Stan language](https://github.com/Julia-Tempering/Pigeons.jl/blob/main/examples/stan/unid.stan).
 
 We first create a custom type, `MyLogPotential` to control dispatch on the interface [`target`](@ref).
@@ -119,14 +119,13 @@ Pigeons have several built-in [`explorer`](@ref) kernels such as
 However when the state space is neither the reals nor the integers,
 or for performance reasons, it may be necessary to create custom
 exploration MCMC kernels.
-This is described on the [custom explorers page](input-explorers.html).
+This is described on the [custom explorers page](@ref input-explorers).
 
 
 ## Manipulating the output
 
 Some
-common post-processing are shown below, see [the section on output processing for more information](output-overview
-.html).
+common post-processing are shown below, see [the section on output processing for more information](@ref output-overview).
 
 ```@example julia
 using MCMCChains
@@ -146,7 +145,7 @@ samples
 ```
 
 ```@raw html
-<iframe src="julia_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
+<iframe src="../julia_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
 ```
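
The page being edited builds a "black-box" target around a callable `MyLogPotential` type. A minimal sketch of that pattern for the unidentifiable coin-flip example, assuming the Pigeons convention that a log potential is a callable object returning a (possibly unnormalized) log density; the constructor arguments and the density below are illustrative:

```julia
using Pigeons, Distributions, Random

# Black-box target: only the product p1 * p2 is identified by the data
struct MyLogPotential
    n_trials::Int
    n_successes::Int
end

# Evaluate the log density at a point x = [p1, p2]
function (lp::MyLogPotential)(x)
    p1, p2 = x
    (0 ≤ p1 ≤ 1 && 0 ≤ p2 ≤ 1) || return -Inf
    return logpdf(Binomial(lp.n_trials, p1 * p2), lp.n_successes)
end

# Starting point for each replica (this line also appears verbatim in the input-explorers.md hunk)
Pigeons.initialization(::MyLogPotential, ::AbstractRNG, ::Int) = [0.5, 0.5]
```

A run would then pass an instance as the target, together with a reference whose normalization constant is known, e.g. `pigeons(target = MyLogPotential(100, 50), reference = MyLogPotential(0, 0))`; these argument values are illustrative.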

docs/src/input-nonjulian.md (2 additions, 2 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Targeting a non-Julian model
+# [Targeting a non-Julian model](@id input-nonjulian)
 
 Suppose you have some code implementing vanilla MCMC, written in an arbitrary "foreign" language such as C++, Python, R, Java, etc. You would like to turn this vanilla MCMC code into a Parallel Tempering algorithm able to harness large numbers of cores, including distributing this algorithm over MPI. However, you do not wish to learn anything about MPI/multi-threading/Parallel Tempering.
 
@@ -48,7 +48,7 @@ Pigeons.setup_blang("blangDemos")
 
 Next, we run a
 [Blang implementation](https://github.com/UBC-Stat-ML/blangDemos/blob/master/src/main/java/demos/UnidentifiableProduct.bl) of
-our usual [unidentifiable toy example](unidentifiable-example.html):
+our usual [unidentifiable toy example](@ref unidentifiable-example):
 
 ```@example blang
 using Pigeons

docs/src/input-overview.md (7 additions, 7 deletions)

@@ -2,18 +2,18 @@
 CurrentModule = Pigeons
 ```
 
-# Overview: inputting an integral/expectation problem into pigeons
+# [Overview: inputting an integral/expectation problem into pigeons](@id input-overview)
 
 Pigeons takes as input an expectation or integration problem.
 Pigeons supports a wide range of methods for specifying the input problem,
 described in the pages below.
 
-- [Turing.jl model](input-turing.html): a succinct specification of a joint distribution from which a posterior (target) and prior (reference) are extracted.
-- [Black-box Julia function](input-julia.html): less automated, but more general and fully configurable.
-- [Stan model](input-stan.html): a convenient adaptor for the most popular Bayesian modelling language.
-- [MCMC code implemented in another language](input-nonjulian.html): bridging your MCMC code to pigeons to make it distributed and parallel.
-- [Customize the MCMC explorers used by PT](input-explorers.html).
+- [Turing.jl model](@ref input-turing): a succinct specification of a joint distribution from which a posterior (target) and prior (reference) are extracted.
+- [Black-box Julia function](@ref input-julia): less automated, but more general and fully configurable.
+- [Stan model](@ref input-stan): a convenient adaptor for the most popular Bayesian modelling language.
+- [MCMC code implemented in another language](@ref input-nonjulian): bridging your MCMC code to pigeons to make it distributed and parallel.
+- [Customize the MCMC explorers used by PT](@ref input-explorers).
 
 We exemplify these different input methods on a recurrent example:
 an unidentifiable toy model,
-see [the page describing the recurrent example in more details](unidentifiable-example.html).
+see [the page describing the recurrent example in more details](@ref unidentifiable-example).

docs/src/input-stan.md (4 additions, 5 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Stan model as input to pigeons
+# [Stan model as input to pigeons](@id input-stan)
 
 !!! note
 
@@ -17,7 +17,7 @@ To target the posterior distribution specified by
 a [Stan](https://mc-stan.org/) model, use
 a [`StanLogPotential`](@ref).
 
-Here we show how this is done using our familiar [unidentifiable toy example](unidentifiable-example.html)
+Here we show how this is done using our familiar [unidentifiable toy example](@ref unidentifiable-example)
 [ported to the Stan language](https://github.com/Julia-Tempering/Pigeons.jl/blob/main/examples/stan/unid.stan).
 
 ```@example stan
@@ -85,8 +85,7 @@ However, sample post-processing functions such as [`sample_array()`](@ref) and [
 convert back to the original ("constrained") parameterization via [`extract_sample()`](@ref).
 
 As a result parameterization issues can be essentially ignored when post-processing, for example some
-common post-processing are shown below, see [the section on output processing for more information](output-overview
-.html).
+common post-processing are shown below, see [the section on output processing for more information](@ref output-overview).
 
 ```@example stan
 using MCMCChains
@@ -105,7 +104,7 @@ samples
 ```
 
 ```@raw html
-<iframe src="stan_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
+<iframe src="../stan_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
 ```

docs/src/input-turing.md (3 additions, 4 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Turing.jl model as input to pigeons
+# [Turing.jl model as input to pigeons](@id input-turing)
 
 To target the posterior distribution specified by
 a [Turing.jl](https://github.com/TuringLang/Turing.jl) model use
@@ -37,8 +37,7 @@ However, sample post-processing functions such as [`sample_array()`](@ref) and [
 convert back to the original ("constrained") parameterization via [`extract_sample()`](@ref).
 
 As a result parameterization issues can be essentially ignored when post-processing, for example some
-common post-processing are shown below, see [the section on output processing for more information](output-overview
-.html).
+common post-processing are shown below, see [the section on output processing for more information](@ref output-overview).
 
 ```@example turing
 using MCMCChains
@@ -56,6 +55,6 @@ samples
 ```
 
 ```@raw html
-<iframe src="turing_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
+<iframe src="../turing_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
 ```

docs/src/mpi.md (2 additions, 2 deletions)

@@ -44,7 +44,7 @@ Create an issue if you would like another submission system included.
 
 Follow these instructions to run MPI over several machines:
 
-1. In the cluster login node, follow the [local installation instructions](index.html).
+1. In the cluster login node, follow the [local installation instructions](@ref installing-pigeons).
 2. Start Julia in the login node, and perform a one-time setup by calling [`setup_mpi()`](@ref). Its argument are the fields in [`MPISettings`](@ref), see the documentation there for details.
 3. Still in the Julia REPL running in the login node, use:
 
@@ -77,4 +77,4 @@ and cancel/kill a job using
 kill_job(mpi_run)
 ```
 
-To analyze the output, see the documentation page on [post-processing for MPI runs](output-mpi-postprocessing.html).
+To analyze the output, see the documentation page on [post-processing for MPI runs](@ref output-mpi-postprocessing).
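
Step 2 of the instructions calls `setup_mpi()` with the fields of `MPISettings`. A hedged sketch of that one-time call on a PBS-style cluster; the keyword names and values below are assumptions for illustration, so consult the `MPISettings` docstring for the authoritative list:

```julia
using Pigeons

# One-time setup, executed from the cluster's login node (illustrative values)
setup_mpi(
    submission_system   = :pbs,               # scheduler used by the cluster
    environment_modules = ["gcc", "openmpi"], # modules loaded before each MPI job
)
```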

docs/src/output-custom-types.md (2 additions, 2 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Output for custom types
+# [Output for custom types](@id output-custom-types)
 
 The [`sample_array`](@ref) function assumes that the variables are real or integer (the latter coerced into the former)
 and "flattened" into a uniform array.
@@ -21,5 +21,5 @@ vector = get_sample(pt)
 length(vector) # = number of iterations = 2^10
 ```
 
-Another option is to use [off-memory processing](output-off-memory.html) which makes no assumption
+Another option is to use [off-memory processing](@ref output-off-memory) which makes no assumption
 either on the type of each individual sample.

docs/src/output-mpi-postprocessing.md (7 additions, 7 deletions)

@@ -3,7 +3,7 @@ CurrentModule = Pigeons
 ```
 
 
-# Post-processing for MPI runs (plotting, summaries, etc)
+# [Post-processing for MPI runs (plotting, summaries, etc)](@id output-mpi-postprocessing)
 
 Two options are available to post-process samples produced from
 MPI runs: (1) loading
@@ -27,12 +27,12 @@ This will load the information distributed across several machines
 into the interactive node.
 
 Once you have a [`PT`](@ref) struct, proceed in the same way as
-when running PT locally, e.g. [see the page on plotting](output-plotting.html),
-[the page on online statistics](output-online.html),
-and [the page on sample summaries and diagnostics](summaries.html).
+when running PT locally, e.g. [see the page on plotting](@ref output-plotting),
+[the page on online statistics](@ref output-online),
+and [the page on sample summaries and diagnostics](@ref output-numerical).
 
 For example, here is how to modify the posterior density and trace plot
-example from [the plotting page](output-plotting.html) to run as a local MPI job
+example from [the plotting page](@ref output-plotting) to run as a local MPI job
 instead of in-process (the lines differing from the local version are marked
 with (*)):
 
@@ -70,7 +70,7 @@ nothing # hide
 ```
 
 ```@raw html
-<iframe src="mpi_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
+<iframe src="../mpi_posterior_densities_and_traces.html" style="height:500px;width:100%;"></iframe>
 ```
 
 # Perform post-processing by loading samples from disk one at a time
@@ -113,5 +113,5 @@ nothing # hide
 ```
 
 ```@raw html
-<iframe src="first_dim_of_each.html" style="height:500px;width:100%;"></iframe>
+<iframe src="../first_dim_of_each.html" style="height:500px;width:100%;"></iframe>
 ```
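
The second hunk edits an example that reruns the plotting code as a local MPI job instead of in-process. A rough sketch of the two lines that typically differ (marked (*) in the page); `ChildProcess` is part of Pigeons, while the recorder list and the loading call shown here are assumptions based on the local plotting example:

```julia
using Pigeons

pt_result = pigeons(
    target = toy_mvn_target(100),                      # illustrative target
    record = [traces; round_trip; record_default()],   # recorders, as in the local example
    on     = ChildProcess(n_local_mpi_processes = 4),  # (*) run as a local MPI job
)
pt = Pigeons.load(pt_result)                           # (*) bring the distributed results back
```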

docs/src/output-normalization.md (3 additions, 3 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Approximation of the normalization constant
+# [Approximation of the normalization constant](@id output-normalization)
 
 ## Background
 
@@ -22,7 +22,7 @@ In many applications, it is useful to approximate the constant ``Z``. For examp
 As a side-product of parallel tempering, we automatically obtain an approximate the natural logarithm of the normalization constant ``\log Z``. This is done automatically using the
 [stepping stone estimator](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3038348/) computed in [`stepping_stone()`](@ref).
 
-It is shown in the [standard output report](output-report.html) produced at each round:
+It is shown in the [standard output report](@ref output-reports) produced at each round:
 
 ```@example constants
 using Pigeons
@@ -48,7 +48,7 @@ log of the *ratio*, ``\log (Z_1/ Z_0)`` where ``Z_1`` and ``Z_0`` are the normal
 
 Hence to estimate ``\log Z_1`` the reference distribution ``\pi_1`` should have a known normalization constant. In cases where the reference is a proper prior distribution, for example in Turing.jl models, this is typically the case.
 
-In scenarios where the reference is specified manually, e.g. for black-box functions or Stan models, more care is needed. In such cases, one alternative is to use [variational PT](`variational.html`) in which case the built-in variational distribution is constructed so that its normalization constant is one.
+In scenarios where the reference is specified manually, e.g. for black-box functions or Stan models, more care is needed. In such cases, one alternative is to use [variational PT](@ref variational-pt) in which case the built-in variational distribution is constructed so that its normalization constant is one.
 
 !!! note "Normalization of Stan models"
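
The page credits the stepping stone estimator computed by `stepping_stone()`. A minimal sketch of extracting that estimate from a run; the target is illustrative and the exact return value should be checked against the `stepping_stone()` docstring:

```julia
using Pigeons

pt = pigeons(target = toy_mvn_target(100))
log_Z = Pigeons.stepping_stone(pt)   # stepping stone approximation of log(Z)
```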

docs/src/output-numerical.md (2 additions, 2 deletions)

@@ -2,7 +2,7 @@
 CurrentModule = Pigeons
 ```
 
-# Numerical outputs and diagnostics
+# [Numerical outputs and diagnostics](@id output-numerical)
 
 Use [`sample_array()`](@ref) to convert target chain
 samples into a format that can then be consumed by the
@@ -44,7 +44,7 @@ samples
 ## Accessing individual diagnostics and summaries
 
 Computing a mean
-(but see [online statistics](output-online.html) for
+(but see [online statistics](@ref output-online) for
 a constant memory alternative):
 
 ```example numerical
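
Several of the edited pages end with the same post-processing snippet built on `sample_array()` and MCMCChains. A condensed sketch of that pattern (the target and recorder list are illustrative):

```julia
using Pigeons, MCMCChains, Statistics

pt = pigeons(target = toy_mvn_target(10), record = [traces])
samples = Chains(sample_array(pt), variable_names(pt))
mean(samples)   # summaries and diagnostics then come from MCMCChains
```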

docs/src/output-off-memory.md (3 additions, 3 deletions)

@@ -2,20 +2,20 @@
 CurrentModule = Pigeons
 ```
 
-# Off-memory processing
+# [Off-memory processing](@id output-off-memory)
 
 When the dimensionality of a model is large and/or the
 number of MCMC samples is large, the samples may not
 fit in memory.
 In some situation, it may be possible to compute the
 output in finite memory, as described in
-[the online statistics documentation page](output-online.html).
+[the online statistics documentation page](@ref output-online).
 However not all situations admit sufficient statistics and
 in this case it is necessary to store samples to disk.
 We show here how to do so when pigeons is ran on a single
 machine, but the interface is similar over MPI and
 described in the
-[MPI sample processing documentation page](output-mpi-postprocessing.html).
+[MPI sample processing documentation page](@ref output-mpi-postprocessing).
 
 
 ## Prepare the PT run with the disk recorder
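
The heading visible at the end of this hunk refers to preparing the run with a disk recorder. A hedged sketch of that setup, assuming the `disk` recorder and the `checkpoint` option (the target is illustrative):

```julia
using Pigeons

pt = pigeons(
    target     = toy_mvn_target(100),  # illustrative target
    checkpoint = true,                 # allow results to be reloaded later
    record     = [disk],               # stream samples to disk instead of keeping them in memory
)
```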
