Update to the AdvancedVI@0.4 interface #2506


Merged Jun 3, 2025 (69 commits; the diff below shows changes from 33 commits)

Commits
ed6946c
update to match the AdvancedVI@0.3 interface
Red-Portal Mar 14, 2025
a94269d
run formatter
Red-Portal Mar 14, 2025
a4711a9
run formatter
Red-Portal Mar 14, 2025
3f8068b
run formatter
Red-Portal Mar 14, 2025
222a638
run formatter
Red-Portal Mar 14, 2025
57097f5
run formatter
Red-Portal Mar 14, 2025
a42eea8
run formatter
Red-Portal Mar 14, 2025
798f319
run formatter
Red-Portal Mar 14, 2025
69a4972
run formatter
Red-Portal Mar 14, 2025
cbcb8b5
run formatter
Red-Portal Mar 14, 2025
081d6ff
remove plotting
Red-Portal Mar 14, 2025
a32a673
Merge branch 'update_advancedvi' of github.com:TuringLang/Turing.jl i…
Red-Portal Mar 14, 2025
1bcec3e
fix formatting
Red-Portal Mar 14, 2025
b142832
fix formatting
Red-Portal Mar 14, 2025
061ec35
fix formatting
Red-Portal Mar 14, 2025
736bd3e
remove unused dependency
Red-Portal Mar 14, 2025
fd434d8
Merge branch 'update_advancedvi' of github.com:TuringLang/Turing.jl i…
Red-Portal Mar 14, 2025
57108ee
Merge branch 'main' into update_advancedvi
yebai Mar 18, 2025
8dc8067
Merge branch 'main' into update_advancedvi
yebai Mar 20, 2025
297c32a
Update Project.toml
yebai Mar 20, 2025
3010b5e
Merge branch 'main' of github.com:TuringLang/Turing.jl into update_ad…
Red-Portal Mar 25, 2025
0c04434
fix make some arguments of vi initializer to be optional kwargs
Red-Portal Mar 25, 2025
17a8290
Merge branch 'update_advancedvi' of github.com:TuringLang/Turing.jl i…
Red-Portal Mar 25, 2025
626c5b5
remove tests for custom optimizers
Red-Portal Mar 25, 2025
cb2c618
remove unused file
Red-Portal Mar 25, 2025
2b08a4b
Merge branch 'main' of github.com:TuringLang/Turing.jl into update_ad…
Red-Portal Mar 29, 2025
0e496c4
Merge branch 'main' into update_advancedvi
yebai Apr 18, 2025
c1533a8
Update src/variational/bijectors.jl
yebai Apr 18, 2025
231d6e2
Update Turing.jl
yebai Apr 21, 2025
c2ae04a
Merge branch 'main' of github.com:TuringLang/Turing.jl into update_ad…
Red-Portal Apr 29, 2025
69639ec
fix remove call to `AdvancedVI.turnprogress`, which has been removed
Red-Portal Apr 29, 2025
ef9aeb1
apply comments from @yebai
Red-Portal Apr 29, 2025
43c19aa
Merge branch 'update_advancedvi' of github.com:TuringLang/Turing.jl i…
Red-Portal Apr 29, 2025
cc18528
Update src/variational/VariationalInference.jl
yebai May 8, 2025
162899a
Merge branch 'main' into update_advancedvi
yebai May 8, 2025
0b79495
add old interface as deprecated
Red-Portal May 14, 2025
3818152
bump AdvancedVI version
Red-Portal May 14, 2025
91a9afe
add deprecation for `meanfield`
Red-Portal May 14, 2025
12539aa
add `default_rng` interfaces
Red-Portal May 14, 2025
0653bf1
add tests for variational inference
Red-Portal May 14, 2025
f74ec38
run formatter
Red-Portal May 14, 2025
406824f
Merge branch 'main' of github.com:TuringLang/Turing.jl into update_ad…
Red-Portal May 14, 2025
f62e7b8
remove "src/variational/bijectors.jl" (moved to `DynamicPPL.jl`)
Red-Portal May 18, 2025
e3b7618
Merge remote-tracking branch 'origin/main' into update_advancedvi
yebai May 21, 2025
f0374b6
add more tests for variational inference initializer
Red-Portal May 23, 2025
187a65c
remove non-essential reexports, fix tests
Red-Portal May 23, 2025
a5021d1
run formatter, rename functions
Red-Portal May 23, 2025
218eb23
add documentation
Red-Portal May 23, 2025
4714c3c
fix run formatter
Red-Portal May 23, 2025
f712755
fix remove debug commits
Red-Portal May 23, 2025
8086398
run formatter
Red-Portal May 24, 2025
37f6b06
run formatter
Red-Portal May 24, 2025
c717220
run formatter
Red-Portal May 24, 2025
e9f7f1e
add Variational submodule
Red-Portal May 24, 2025
6a8c6ed
fix docstring style
Red-Portal May 24, 2025
c4d73fb
update docstring style
Red-Portal May 26, 2025
feb1a57
format docstring style
Red-Portal May 26, 2025
ea417fc
Merge branch 'breaking' into update_advancedvi
penelopeysm May 28, 2025
4c9a538
fix typo
Red-Portal May 30, 2025
dfa8d20
fix use fixed seed with StableRNGs
Red-Portal May 30, 2025
a18f581
fix export variational families
Red-Portal May 30, 2025
f9528e0
fix forma
Red-Portal May 30, 2025
fb150c7
Merge branch 'main' into update_advancedvi
yebai May 30, 2025
8174725
Merge remote-tracking branch 'origin/breaking' into update_advancedvi
penelopeysm Jun 2, 2025
dec108b
update changelog for advancedvi 0.4
Red-Portal Jun 2, 2025
b0d791e
fix version number
Red-Portal Jun 2, 2025
29373ee
Format & add some links
penelopeysm Jun 2, 2025
d21e652
Merge branch 'breaking' into update_advancedvi
Red-Portal Jun 3, 2025
4c72501
fix formatting
Red-Portal Jun 3, 2025
2 changes: 1 addition & 1 deletion Project.toml
@@ -53,7 +53,7 @@ Accessors = "0.1"
 AdvancedHMC = "0.3.0, 0.4.0, 0.5.2, 0.6, 0.7"
 AdvancedMH = "0.8"
 AdvancedPS = "0.6.0"
-AdvancedVI = "0.2"
+AdvancedVI = "0.3.1"
 BangBang = "0.4.2"
 Bijectors = "0.14, 0.15"
 Compat = "4.15.0"
2 changes: 0 additions & 2 deletions src/Turing.jl
@@ -39,8 +39,6 @@ function setprogress!(progress::Bool)
     @info "[Turing]: progress logging is $(progress ? "enabled" : "disabled") globally"
     PROGRESS[] = progress
     AbstractMCMC.setprogress!(progress; silent=true)
-    # TODO: `AdvancedVI.turnprogress` is removed in AdvancedVI v0.3
-    AdvancedVI.turnprogress(progress)
     return progress
 end
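With the `AdvancedVI.turnprogress` call gone, `setprogress!` only touches Turing's own flag and AbstractMCMC's. A minimal usage sketch (the `false` value is arbitrary):

using Turing
Turing.setprogress!(false)  # toggles progress logging globally; AdvancedVI is no longer involved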
159 changes: 129 additions & 30 deletions src/variational/VariationalInference.jl
@@ -1,50 +1,149 @@

 module Variational

-using DistributionsAD: DistributionsAD
-using DynamicPPL: DynamicPPL
-using StatsBase: StatsBase
-using StatsFuns: StatsFuns
-using LogDensityProblems: LogDensityProblems
+using DynamicPPL
+using ADTypes
+using Distributions
+using LinearAlgebra
+using LogDensityProblems
+using Random

-using Random: Random
+import ..Turing: DEFAULT_ADTYPE, PROGRESS

 import AdvancedVI
 import Bijectors

-# Reexports
-using AdvancedVI: vi, ADVI, ELBO, elbo, TruncatedADAGrad, DecayedADAGrad
-export vi, ADVI, ELBO, elbo, TruncatedADAGrad, DecayedADAGrad
-
-"""
-    make_logjoint(model::Model; weight = 1.0)
-Constructs the logjoint as a function of latent variables, i.e. the map z → p(x ∣ z) p(z).
-The weight used to scale the likelihood, e.g. when doing stochastic gradient descent one needs to
-use `DynamicPPL.MiniBatch` context to run the `Model` with a weight `num_total_obs / batch_size`.
-## Notes
-- For sake of efficiency, the returned function is closes over an instance of `VarInfo`. This means that you *might* run into some weird behaviour if you call this method sequentially using different types; if that's the case, just generate a new one for each type using `make_logjoint`.
-"""
-function make_logjoint(model::DynamicPPL.Model; weight=1.0)
-    # setup
+using AdvancedVI: RepGradELBO, ScoreGradELBO, DoG, DoWG
+export RepGradELBO, ScoreGradELBO, DoG, DoWG
+
+export vi, q_init, q_meanfield_gaussian, q_fullrank_gaussian
+
+include("bijectors.jl")
+
+function make_logdensity(model::DynamicPPL.Model)
+    weight = 1.0
     ctx = DynamicPPL.MiniBatchContext(DynamicPPL.DefaultContext(), weight)
-    f = DynamicPPL.LogDensityFunction(model, DynamicPPL.VarInfo(model), ctx)
-    return Base.Fix1(LogDensityProblems.logdensity, f)
+    return DynamicPPL.LogDensityFunction(model, DynamicPPL.VarInfo(model), ctx)
 end
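Unlike the old `make_logjoint`, which wrapped the problem in a closure via `Base.Fix1`, the new `make_logdensity` returns the `DynamicPPL.LogDensityFunction` itself, to be queried through the LogDensityProblems interface. A rough sketch of how a caller consumes it (the model `m` and the linked-space point `z` are hypothetical):

using LogDensityProblems

prob = make_logdensity(m)                      # m is some DynamicPPL.Model
logp = LogDensityProblems.logdensity(prob, z)  # z is a vector in unconstrained (linked) space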

-# objectives
-function (elbo::ELBO)(
+function initialize_gaussian_scale(
     rng::Random.AbstractRNG,
-    alg::AdvancedVI.VariationalInference,
-    q,
     model::DynamicPPL.Model,
-    num_samples;
-    weight=1.0,
+    location::AbstractVector,
+    scale::AbstractMatrix;
+    num_samples::Int=10,
+    num_max_trials::Int=10,
+    reduce_factor=one(eltype(scale)) / 2,
+)
+    prob = make_logdensity(model)
+    ℓπ = Base.Fix1(LogDensityProblems.logdensity, prob)
+    varinfo = DynamicPPL.VarInfo(model)
+
+    n_trial = 0
+    while true
+        q = AdvancedVI.MvLocationScale(location, scale, Normal())
+        b = Bijectors.bijector(model; varinfo=varinfo)
+        q_trans = Bijectors.transformed(q, Bijectors.inverse(b))
+        energy = mean(ℓπ, eachcol(rand(rng, q_trans, num_samples)))
+
+        if isfinite(energy)
+            return scale
+        elseif n_trial == num_max_trials
+            error("Could not find an initial")
+        end
+
+        scale = reduce_factor * scale
+        n_trial += 1
+    end
+end
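The loop above is a numerical safeguard: starting from the supplied scale, it shrinks the scale by `reduce_factor` (halving by default) until the mean log-density over `num_samples` draws from the transformed candidate is finite, giving up after `num_max_trials` attempts. A usage sketch consistent with the signature above (the model `m` and dimension `d` are hypothetical):

using LinearAlgebra, Random

rng = Random.default_rng()
d = 5  # hypothetical number of linked parameters of m
L = initialize_gaussian_scale(rng, m, zeros(d), Diagonal(ones(d)))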

+function q_init(
+    rng::Random.AbstractRNG,
+    model::DynamicPPL.Model;
+    location::Union{Nothing,<:AbstractVector}=nothing,
+    scale::Union{Nothing,<:Diagonal,<:LowerTriangular}=nothing,
+    meanfield::Bool=true,
+    basedist::Distributions.UnivariateDistribution=Normal(),
     kwargs...,
 )
-    return elbo(rng, alg, q, make_logjoint(model; weight=weight), num_samples; kwargs...)
+    varinfo = DynamicPPL.VarInfo(model)
+    # Use linked `varinfo` to determine the correct number of parameters.
+    # TODO: Replace with `length` once this is implemented for `VarInfo`.
+    varinfo_linked = DynamicPPL.link(varinfo, model)
+    num_params = length(varinfo_linked[:])
+
+    μ = if isnothing(location)
+        zeros(num_params)
+    else
+        @assert length(location) == num_params "Length of the provided location vector, $(length(location)), does not match dimension of the target distribution, $(num_params)."
+        location
+    end
+
+    L = if isnothing(scale)
+        if meanfield
+            initialize_gaussian_scale(rng, model, μ, Diagonal(ones(num_params)); kwargs...)
+        else
+            L0 = LowerTriangular(Matrix{Float64}(I, num_params, num_params))
+            initialize_gaussian_scale(rng, model, μ, L0; kwargs...)
+        end
+    else
+        @assert size(scale) == (num_params, num_params) "Dimensions of the provided scale matrix, $(size(scale)), does not match the dimension of the target distribution, $(num_params)."
+        if meanfield
+            Diagonal(diag(scale))
+        else
+            scale
+        end
+    end
+    q = AdvancedVI.MvLocationScale(μ, L, basedist)
+    b = Bijectors.bijector(model; varinfo=varinfo)
+    return Bijectors.transformed(q, Bijectors.inverse(b))
 end
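Since `q_init` accepts a user-supplied `scale`, a full-rank family can also be started from a custom lower-triangular factor; a sketch under the signature above (names and values are hypothetical):

using LinearAlgebra, Random

d = 5  # must match the number of linked parameters of m
L0 = LowerTriangular(0.1 * Matrix{Float64}(I, d, d))
q = q_init(Random.default_rng(), m; scale=L0, meanfield=false)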

-# VI algorithms
-include("advi.jl")
+function q_meanfield_gaussian(
+    rng::Random.AbstractRNG,
+    model::DynamicPPL.Model;
+    location::Union{Nothing,<:AbstractVector}=nothing,
+    scale::Union{Nothing,<:Diagonal}=nothing,
+    kwargs...,
+)
+    return q_init(rng, model; location, scale, meanfield=true, basedist=Normal(), kwargs...)
+end
+
+function q_fullrank_gaussian(
+    rng::Random.AbstractRNG,
+    model::DynamicPPL.Model;
+    location::Union{Nothing,<:AbstractVector}=nothing,
+    scale::Union{Nothing,<:LowerTriangular}=nothing,
+    kwargs...,
+)
+    return q_init(rng, model; location, scale, meanfield=false, basedist=Normal(), kwargs...)
+end
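The two exported constructors differ only in the scale structure they force through `q_init`: `Diagonal` for mean-field, `LowerTriangular` for full-rank. For illustration (the model `m` is hypothetical):

using Random

q_mf = q_meanfield_gaussian(Random.default_rng(), m)
q_fr = q_fullrank_gaussian(Random.default_rng(), m)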

+function vi(

Review comment (Member):
This function is a thin wrapper around `AdvancedVI.optimize`. I'd suggest we consider #2509 (comment) for this interface to make it future-proof:

    optimise(model, VI(q, n_iterations, objective; ...), ...)

Reply (Member Author):
Given that the function `vi` itself is more of a legacy from the v0.2 days, I suggest we first go with this and add the new interface in the future? Whatever we decide to do, I think having `vi` now won't be too much of a hassle in the future. (On that matter, I am sympathetic to the proposal to unify everything into `infer`.)

+    model::DynamicPPL.Model,
+    q::Bijectors.TransformedDistribution,
+    n_iterations::Int;
+    objective=RepGradELBO(10; entropy=AdvancedVI.ClosedFormEntropyZeroGradient()),
+    show_progress::Bool=PROGRESS[],
+    optimizer=AdvancedVI.DoWG(),
+    averager=AdvancedVI.PolynomialAveraging(),
+    operator=AdvancedVI.ProximalLocationScaleEntropy(),
+    adtype::ADTypes.AbstractADType=DEFAULT_ADTYPE,
+    kwargs...,
+)
+    return AdvancedVI.optimize(
+        make_logdensity(model),
+        objective,
+        q,
+        n_iterations;
+        show_progress=show_progress,
+        adtype,
+        optimizer,
+        averager,
+        operator,
+        kwargs...,
+    )
+end
+
 end
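Taken together, the interface at this commit is: build an initial variational family with one of the `q_*` constructors, then pass it to `vi`, which forwards everything to `AdvancedVI.optimize`. An end-to-end sketch under the signatures shown in this diff (the model, data, and iteration count are all hypothetical):

using Turing
using Random

@model function demo(x)
    σ ~ truncated(Normal(); lower=0)
    μ ~ Normal(0, 1)
    for i in eachindex(x)
        x[i] ~ Normal(μ, σ)
    end
end

model = demo(randn(100))
q0 = Turing.Variational.q_meanfield_gaussian(Random.default_rng(), model)
result = Turing.Variational.vi(model, q0, 1_000)  # thin wrapper over AdvancedVI.optimize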
140 changes: 0 additions & 140 deletions src/variational/advi.jl

This file was deleted.
