Commit ab92409

Fix more imports
1 parent 8c5f64a commit ab92409

File tree

5 files changed: +16 -12 lines changed

developers/compiler/minituring-contexts/index.qmd

Lines changed: 1 addition & 1 deletion
@@ -294,7 +294,7 @@ Of course, using an MCMC algorithm to sample from the prior is unnecessary and s
 The use of contexts also goes far beyond just evaluating log probabilities and sampling. Some examples from Turing are

 * `FixedContext`, which fixes some variables to given values and removes them completely from the evaluation of any log probabilities. They power the `Turing.fix` and `Turing.unfix` functions.
-* `ConditionContext` conditions the model on fixed values for some parameters. They are used by `Turing.condition` and `Turing.uncondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.
+* `ConditionContext` conditions the model on fixed values for some parameters. They are used by `Turing.condition` and `Turing.decondition`, i.e. the `model | (parameter=value,)` syntax. The difference between `fix` and `condition` is whether the log probability for the corresponding variable is included in the overall log density.

 * `PriorExtractorContext` collects information about what the prior distribution of each variable is.
 * `PrefixContext` adds prefixes to variable names, allowing models to be used within other models without variable name collisions.
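
The `fix`/`condition` distinction described in the changed bullet can be seen directly from the log joint density. The following is a minimal sketch, not part of this commit: the toy model and values are invented for illustration, and the `NamedTuple` method of `DynamicPPL.logjoint` is an assumption about the current API.

```julia
using Turing
using DynamicPPL

@model function toy()
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
end

# Conditioning keeps the log probability of `m` in the joint density.
conditioned = toy() | (m=1.0,)

# Fixing removes `m` from the density entirely; only `s` contributes.
fixed = Turing.fix(toy(); m=1.0)

DynamicPPL.logjoint(conditioned, (s=1.0,))  # includes logpdf(Normal(0, sqrt(1.0)), 1.0)
DynamicPPL.logjoint(fixed, (s=1.0,))        # does not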

developers/compiler/model-manual/index.qmd

Lines changed: 4 additions & 4 deletions
@@ -36,26 +36,26 @@ using DynamicPPL
 function gdemo2(model, varinfo, context, x)
     # Assume s² has an InverseGamma distribution.
     s², varinfo = DynamicPPL.tilde_assume!!(
-        context, InverseGamma(2, 3), Turing.@varname(s²), varinfo
+        context, InverseGamma(2, 3), @varname(s²), varinfo
     )

     # Assume m has a Normal distribution.
     m, varinfo = DynamicPPL.tilde_assume!!(
-        context, Normal(0, sqrt(s²)), Turing.@varname(m), varinfo
+        context, Normal(0, sqrt(s²)), @varname(m), varinfo
     )

     # Observe each value of x[i] according to a Normal distribution.
     for i in eachindex(x)
         _retval, varinfo = DynamicPPL.tilde_observe!!(
-            context, Normal(m, sqrt(s²)), x[i], Turing.@varname(x[i]), varinfo
+            context, Normal(m, sqrt(s²)), x[i], @varname(x[i]), varinfo
         )
     end

     # The final return statement should comprise both the original return
     # value and the updated varinfo.
     return nothing, varinfo
 end
-gdemo2(x) = Turing.Model(gdemo2, (; x))
+gdemo2(x) = DynamicPPL.Model(gdemo2, (; x))

 # Instantiate a Model object with our data variables.
 model2 = gdemo2([1.5, 2.0])
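
Nothing else about the workflow changes: the manually built `model2` above is an ordinary `DynamicPPL.Model`, so it can be sampled like a macro-defined one. A hedged usage sketch; the choice of sampler and number of draws is arbitrary and not part of this commit.

```julia
using Turing

# The manually constructed model behaves like any `@model`-defined model.
chain = sample(model2, NUTS(), 1000)
```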

developers/inference/implementing-samplers/index.qmd

Lines changed: 2 additions & 2 deletions
@@ -403,11 +403,11 @@ As we promised, all of this hassle of implementing our `MALA` sampler in a way t
 It also enables use with Turing.jl through the `externalsampler`, but we need to do one final thing first: we need to tell Turing.jl how to extract a vector of parameters from the "sample" returned in our implementation of `AbstractMCMC.step`. In our case, the "sample" is a `MALASample`, so we just need the following line:

 ```{julia}
-# Load Turing.jl.
 using Turing
+using DynamicPPL

 # Overload the `getparams` method for our "sample" type, which is just a vector.
-Turing.Inference.getparams(::Turing.Model, sample::MALASample) = sample.x
+Turing.Inference.getparams(::DynamicPPL.Model, sample::MALASample) = sample.x
 ```

 And with that, we're good to go!
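
With `getparams` overloaded as above, the custom sampler can then be handed to Turing.jl through `externalsampler`. A rough sketch of that final step, assuming `mala` is an instance of the tutorial's `MALA` type (defined elsewhere in that page, not in this commit) and using a made-up toy model:

```julia
using Turing

@model function demo(x)
    m ~ Normal(0, 1)
    x ~ Normal(m, 1)
end

# `mala` is assumed to be an instance of the tutorial's custom `MALA` sampler.
chain = sample(demo(1.5), externalsampler(mala), 1000)
```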

tutorials/bayesian-time-series-analysis/index.qmd

Lines changed: 3 additions & 1 deletion
@@ -165,6 +165,8 @@ scatter!(t, yf; color=2, label="Data")
 With the model specified and with a reasonable prior we can now let Turing decompose the time series for us!

 ```{julia}
+using MCMCChains: get_sections
+
 function mean_ribbon(samples)
     qs = quantile(samples)
     low = qs[:, Symbol("2.5%")]
@@ -174,7 +176,7 @@ function mean_ribbon(samples)
 end

 function get_decomposition(model, x, cyclic_features, chain, op)
-    chain_params = Turing.MCMCChains.get_sections(chain, :parameters)
+    chain_params = get_sections(chain, :parameters)
     return returned(model(x, cyclic_features, op), chain_params)
 end
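
For context on the import added above, `MCMCChains.get_sections` simply restricts a chain to one named section. A self-contained sketch with an artificial chain; the parameter names and section layout here are made up for illustration and are not part of this commit.

```julia
using MCMCChains
using MCMCChains: get_sections

# A tiny artificial chain: one parameter (:m) and one internal statistic (:lp).
vals = randn(100, 2, 1)
chain = Chains(vals, [:m, :lp], Dict(:internals => [:lp]))

# Keep only the :parameters section, i.e. drop the internal :lp column.
chain_params = get_sections(chain, :parameters)
```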

tutorials/variational-inference/index.qmd

Lines changed: 6 additions & 4 deletions
@@ -18,7 +18,7 @@ Here we will focus on how to use VI in Turing and not much on the theory underly
 If you are interested in understanding the mathematics you can checkout [our write-up]({{<meta using-turing-variational-inference>}}) or any other resource online (there a lot of great ones).

 Using VI in Turing.jl is very straight forward.
-If `model` denotes a definition of a `Turing.Model`, performing VI is as simple as
+If `model` denotes a definition of a `DynamicPPL.Model`, performing VI is as simple as

 ```{julia}
 #| eval: false
@@ -54,7 +54,7 @@ x_i &\overset{\text{i.i.d.}}{=} \mathcal{N}(m, s), \quad i = 1, \dots, n

 Recall that *conjugate* refers to the fact that we can obtain a closed-form expression for the posterior. Of course one wouldn't use something like variational inference for a conjugate model, but it's useful as a simple demonstration as we can compare the result to the true posterior.

-First we generate some synthetic data, define the `Turing.Model` and instantiate the model on the data:
+First we generate some synthetic data, define the `DynamicPPL.Model` and instantiate the model on the data:

 ```{julia}
 # generate data
@@ -666,11 +666,13 @@ using Bijectors: Scale, Shift
 ```

 ```{julia}
+using DistributionsAD
+
 d = length(q)
-base_dist = Turing.DistributionsAD.TuringDiagMvNormal(zeros(d), ones(d))
+base_dist = DistributionsAD.TuringDiagMvNormal(zeros(d), ones(d))
 ```

-`bijector(model::Turing.Model)` is defined by Turing, and will return a `bijector` which takes you from the space of the latent variables to the real space. In this particular case, this is a mapping `((0, ∞) × ℝ × ℝ¹⁰) → ℝ¹²`. We're interested in using a normal distribution as a base-distribution and transform samples to the latent space, thus we need the inverse mapping from the reals to the latent space:
+`bijector(model::DynamicPPL.Model)` is defined in DynamicPPL, and will return a `bijector` which takes you from the space of the latent variables to the real space. In this particular case, this is a mapping `((0, ∞) × ℝ × ℝ¹⁰) → ℝ¹²`. We're interested in using a normal distribution as a base-distribution and transform samples to the latent space, thus we need the inverse mapping from the reals to the latent space:

 ```{julia}
 to_constrained = inverse(bijector(m));
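
Reading the last two changed snippets together, the base distribution and the inverse bijector are what get composed into the variational family. A hedged sketch of that composition; it reuses the tutorial's `q` and `m` (not defined in this commit) and assumes the `Bijectors.transformed` API, so treat it as illustration rather than the tutorial's exact code.

```julia
using Bijectors
using DistributionsAD

d = length(q)
base_dist = DistributionsAD.TuringDiagMvNormal(zeros(d), ones(d))

# Map unconstrained draws from `base_dist` back into the model's latent space.
to_constrained = inverse(bijector(m))
q_transformed = Bijectors.transformed(base_dist, to_constrained)
```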
