Turing 0.37 #589

Merged · 6 commits · Mar 25, 2025
692 changes: 339 additions & 353 deletions Manifest.toml

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion Project.toml
@@ -53,4 +53,4 @@ UnPack = "3a884ed6-31ef-47d7-9d2a-63182c4928ed"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[compat]
Turing = "0.36.2"
Turing = "0.37"
2 changes: 1 addition & 1 deletion _quarto.yml
@@ -32,7 +32,7 @@ website:
text: Team
right:
# Current version
- text: "v0.36"
- text: "v0.37"
menu:
- text: Changelog
href: https://turinglang.org/docs/changelog.html
10 changes: 8 additions & 2 deletions core-functionality/index.qmd
@@ -475,7 +475,10 @@ Example usage:
k = length(unique(g))
a ~ filldist(Exponential(), k) # = Product(fill(Exponential(), k))
mu = a[g]
return x .~ Normal.(mu)
for i in eachindex(x)
x[i] ~ Normal(mu[i])
end
return mu
end
```

@@ -491,7 +494,10 @@ Example usage:
k = length(unique(g))
a ~ arraydist([Exponential(i) for i in 1:k])
mu = a[g]
return x .~ Normal.(mu)
for i in eachindex(x)
x[i] ~ Normal(mu[i])
end
return mu
end
```
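The two hunks above suggest that with Turing 0.37 the broadcasted form `x .~ Normal.(mu)` (broadcasting `~` over an array of distributions) is no longer supported, hence the explicit loop. As a hedged alternative sketch (the model name and data shapes here are illustrative, not from the docs), the per-element likelihoods can also be bundled into a single joint distribution with `arraydist`:

```julia
using Turing

# Sketch: wrap element-wise likelihoods in one joint distribution via
# `arraydist`, instead of looping over eachindex(x).
@model function demo_arraydist(x, g)
    k = length(unique(g))
    a ~ filldist(Exponential(), k)
    mu = a[g]
    # One joint observe over all of x; equivalent to the loop above.
    x ~ arraydist(Normal.(mu))
    return mu
end

model = demo_arraydist(randn(10), repeat(1:2, 5))
```

This keeps the likelihood as a single `~` statement, which is often the more idiomatic style when the loop body is a one-liner.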

18 changes: 14 additions & 4 deletions developers/compiler/model-manual/index.qmd
@@ -16,7 +16,9 @@ using Turing
m ~ Normal(0, sqrt(s²))

# Observe each value of x.
@. x ~ Normal(m, sqrt(s²))
x .~ Normal(m, sqrt(s²))

return nothing
end

model = gdemo([1.5, 2.0])
@@ -28,6 +30,8 @@ However, models can be constructed by hand without the use of a macro.
Taking the `gdemo` model above as an example, the macro-based definition can also be implemented (somewhat less generally) as the following macro-free version:

```{julia}
using DynamicPPL

# Create the model function.
function gdemo2(model, varinfo, context, x)
# Assume s² has an InverseGamma distribution.
@@ -41,9 +45,15 @@ function gdemo2(model, varinfo, context, x)
)

# Observe each value of x[i] according to a Normal distribution.
return DynamicPPL.dot_tilde_observe!!(
context, Normal(m, sqrt(s²)), x, Turing.@varname(x), varinfo
)
for i in eachindex(x)
_retval, varinfo = DynamicPPL.tilde_observe!!(
context, Normal(m, sqrt(s²)), x[i], Turing.@varname(x[i]), varinfo
)
end

# The final return statement should comprise both the original return
# value and the updated varinfo.
return nothing, varinfo
end
gdemo2(x) = Turing.Model(gdemo2, (; x))

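As a brief usage sketch (assuming the hand-constructed `gdemo2` above; the sampler and draw count are illustrative), the resulting `Model` behaves exactly like a macro-built one:

```julia
using Turing

# Sketch: a hand-built Model is sampled the same way as a macro-built one.
model = gdemo2([1.5, 2.0])
chain = sample(model, NUTS(), 100; progress=false)
```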
10 changes: 6 additions & 4 deletions tutorials/gaussian-mixture-models/index.qmd
@@ -167,10 +167,12 @@ One solution here is to enforce an ordering on our $\mu$ vector, requiring $\mu_
`Bijectors.jl` [provides](https://turinglang.org/Bijectors.jl/dev/transforms/#Bijectors.OrderedBijector) an easy transformation (`ordered()`) for this purpose:

```{julia}
using Bijectors: ordered

@model function gaussian_mixture_model_ordered(x)
# Draw the parameters for each of the K=2 clusters from a standard normal distribution.
K = 2
μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
μ ~ ordered(MvNormal(Zeros(K), I))
# Draw the weights for the K clusters from a Dirichlet distribution with parameters αₖ = 1.
w ~ Dirichlet(K, 1.0)
# Alternatively, one could use a fixed set of weights.
@@ -285,7 +287,7 @@ using LogExpFunctions
@model function gmm_marginalized(x)
K = 2
D, N = size(x)
μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
μ ~ ordered(MvNormal(Zeros(K), I))
w ~ Dirichlet(K, 1.0)
dists = [MvNormal(Fill(μₖ, D), I) for μₖ in μ]
for i in 1:N
@@ -325,7 +327,7 @@ The `logpdf` implementation for a `MixtureModel` distribution is exactly the mar
@model function gmm_marginalized(x)
K = 2
D, _ = size(x)
μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
μ ~ ordered(MvNormal(Zeros(K), I))
w ~ Dirichlet(K, 1.0)
x ~ MixtureModel([MvNormal(Fill(μₖ, D), I) for μₖ in μ], w)
end
@@ -381,7 +383,7 @@ end
@model function gmm_recover(x)
K = 2
D, N = size(x)
μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
μ ~ ordered(MvNormal(Zeros(K), I))
w ~ Dirichlet(K, 1.0)
dists = [MvNormal(Fill(μₖ, D), I) for μₖ in μ]
x ~ MixtureModel(dists, w)
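A small self-contained sketch of what `ordered` (imported at the top of this tutorial) does, independent of the mixture model: it wraps a multivariate distribution so that draws come out in increasing order, which is what breaks the label-switching symmetry between components.

```julia
using Bijectors: ordered
using Distributions
using FillArrays: Zeros
using LinearAlgebra: I

# Sketch: draws from an ordered distribution have sorted components.
d = ordered(MvNormal(Zeros(2), I))
μ = rand(d)
@assert issorted(μ)  # components come out in increasing order
```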
2 changes: 1 addition & 1 deletion tutorials/hidden-markov-models/index.qmd
@@ -96,7 +96,7 @@ The priors on our transition matrix are noninformative, using `T[i] ~ Dirichlet(
N = length(y)

# State sequence.
s = tzeros(Int, N)
s = zeros(Int, N)

# Emission matrix.
m = Vector(undef, K)
4 changes: 2 additions & 2 deletions tutorials/infinite-mixture-models/index.qmd
@@ -188,10 +188,10 @@ In Turing we can implement an infinite Gaussian mixture model using the Chinese
H = Normal(μ0, σ0)

# Latent assignment.
z = tzeros(Int, length(x))
z = zeros(Int, length(x))

# Locations of the infinitely many clusters.
μ = tzeros(Float64, 0)
μ = zeros(Float64, 0)

for i in 1:length(x)

1 change: 1 addition & 0 deletions tutorials/variational-inference/index.qmd
@@ -36,6 +36,7 @@ We first import the packages to be used:
using Random
using Turing
using Turing: Variational
using Bijectors: bijector
using StatsPlots, Measures

Random.seed!(42);
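For context on the newly imported name (a hedged sketch; the distribution here is illustrative): `bijector(d)` returns the transformation mapping a distribution's support onto unconstrained space, which variational inference uses to work with transformed approximations.

```julia
using Bijectors: bijector
using Distributions

# Sketch: the bijector for a positive-support distribution is the log map,
# taking (0, ∞) to ℝ.
b = bijector(Exponential())
y = b(1.0)  # log(1.0) == 0.0
```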
6 changes: 2 additions & 4 deletions usage/probability-interface/index.qmd
@@ -18,7 +18,7 @@ Let's use a simple model of normally-distributed data as an example.

```{julia}
using Turing
using LinearAlgebra: I
using DynamicPPL
using Random

@model function gdemo(n)
@@ -98,11 +98,9 @@ logjoint(model, sample)
```

For models with many variables `rand(model)` can be prohibitively slow since it returns a `NamedTuple` of samples from the prior distribution of the unconditioned variables.
We recommend working with samples of type `DataStructures.OrderedDict` in this case:
We recommend working with samples of type `DataStructures.OrderedDict` in this case (which Turing re-exports, so it can be used directly):

```{julia}
using DataStructures: OrderedDict

Random.seed!(124)
sample_dict = rand(OrderedDict, model)
```
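A follow-up sketch showing that the dict-valued sample plugs into the same probability queries as a `NamedTuple` sample (the model here mirrors the tutorial's `gdemo` but is redefined so the snippet stands alone; its exact form is an assumption):

```julia
using Turing
using DataStructures: OrderedDict
using Random

# Sketch: a gdemo-like model, redefined here for self-containment.
@model function gdemo_sketch(n)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x ~ filldist(Normal(m, sqrt(s²)), n)
end

Random.seed!(124)
model = gdemo_sketch(3)
sample_dict = rand(OrderedDict, model)

# Dict-valued samples work with the probability interface too.
lp = logjoint(model, sample_dict)
```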