Commit e7c64c3

Merge pull request #426 from gdalle/gd/adtypes_v1
Upgrade to ADTypes v1
2 parents: 17bb8a8 + 844d78c

File tree

15 files changed (+133, -156 lines)

Project.toml

Lines changed: 8 additions & 8 deletions
@@ -57,7 +57,7 @@ NonlinearSolveSymbolicsExt = "Symbolics"
 NonlinearSolveZygoteExt = "Zygote"

 [compat]
-ADTypes = "0.2.6"
+ADTypes = "1.1.0"
 Aqua = "0.8"
 ArrayInterface = "7.9"
 BandedMatrices = "1.5"
@@ -69,7 +69,7 @@ Enzyme = "0.11.15, 0.12"
 FastBroadcast = "0.2.8"
 FastClosures = "0.3.2"
 FastLevenbergMarquardt = "0.1"
-FiniteDiff = "2.21"
+FiniteDiff = "2.22"
 FixedPointAcceleration = "0.3"
 ForwardDiff = "0.10.36"
 LazyArrays = "1.8.2"
@@ -83,7 +83,7 @@ NLSolvers = "0.5"
 NLsolve = "4.5"
 NaNMath = "1"
 NonlinearProblemLibrary = "0.1.2"
-OrdinaryDiffEq = "6.74"
+OrdinaryDiffEq = "6.75"
 Pkg = "1.10"
 PrecompileTools = "1.2"
 Preferences = "1.4"
@@ -94,17 +94,17 @@ RecursiveArrayTools = "3.8"
 Reexport = "1.2"
 SIAMFANLEquations = "1.0.1"
 SafeTestsets = "0.1"
-SciMLBase = "2.28.0"
-SimpleNonlinearSolve = "1.2"
+SciMLBase = "2.34.0"
+SimpleNonlinearSolve = "1.8"
 SparseArrays = "1.10"
-SparseDiffTools = "2.17"
+SparseDiffTools = "2.19"
 SpeedMapping = "0.3"
 StableRNGs = "1"
-StaticArrays = "1.7"
+StaticArrays = "1.9"
 StaticArraysCore = "1.4"
 Sundials = "4.23.1"
 Symbolics = "5.13"
-SymbolicIndexingInterface = "0.3.3"
+SymbolicIndexingInterface = "0.3.15"
 Test = "1.10"
 TimerOutputs = "0.5.23"
 Zygote = "0.6.69"

docs/src/basics/autodiff.md

Lines changed: 7 additions & 9 deletions
@@ -3,19 +3,16 @@
 ## Summary of Finite Differencing Backends

   - [`AutoFiniteDiff`](@ref): Finite differencing, not optimal but always applicable.
-  - [`AutoSparseFiniteDiff`](@ref): Sparse version of [`AutoFiniteDiff`](@ref).

 ## Summary of Forward Mode AD Backends

   - [`AutoForwardDiff`](@ref): The best choice for dense problems.
-  - [`AutoSparseForwardDiff`](@ref): Sparse version of [`AutoForwardDiff`](@ref).
   - [`AutoPolyesterForwardDiff`](@ref): Might be faster than [`AutoForwardDiff`](@ref) for
     large problems. Requires `PolyesterForwardDiff.jl` to be installed and loaded.

 ## Summary of Reverse Mode AD Backends

   - [`AutoZygote`](@ref): The fastest choice for non-mutating array-based (BLAS) functions.
-  - [`AutoSparseZygote`](@ref): Sparse version of [`AutoZygote`](@ref).
   - [`AutoEnzyme`](@ref): Uses `Enzyme.jl` Reverse Mode and should be considered
     experimental.

@@ -26,37 +23,38 @@

 !!! note

-    The `Sparse` versions of the methods refers to automated sparsity detection. These
+    The sparse versions of the methods refer to automated sparsity detection. These
     methods automatically discover the sparse Jacobian form from the function `f`. Note that
     all methods specialize the differentiation on a sparse Jacobian if the sparse Jacobian
     is given as `prob.f.jac_prototype` in the `NonlinearFunction` definition, and the
     `AutoSparse` here simply refers to whether this `jac_prototype` should be generated
     automatically. For more details, see
     [SparseDiffTools.jl](https://github.com/JuliaDiff/SparseDiffTools.jl) and
-    [Sparsity Detection Manual Entry](@ref sparsity-detection).
+    [Sparsity Detection Manual Entry](@ref sparsity-detection), as well as the
+    documentation of [ADTypes.jl](https://github.com/SciML/ADTypes.jl).

 ## API Reference

+```@docs
+AutoSparse
+```
+
 ### Finite Differencing Backends

 ```@docs
 AutoFiniteDiff
-AutoSparseFiniteDiff
 ```

 ### Forward Mode AD Backends

 ```@docs
 AutoForwardDiff
-AutoSparseForwardDiff
 AutoPolyesterForwardDiff
 ```

 ### Reverse Mode AD Backends

 ```@docs
 AutoZygote
-AutoSparseZygote
 AutoEnzyme
-NonlinearSolve.AutoSparseEnzyme
 ```
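The doc entries removed above correspond to backend types that no longer exist under ADTypes v1; they are folded into the single `AutoSparse` wrapper documented in the new `@docs` block. A rough migration sketch (illustrative only, not part of this diff; `ADTypes.dense_ad` is assumed to be the v1 accessor for the wrapped backend):

```julia
using ADTypes  # re-exported by NonlinearSolve

# ADTypes v0.2 spelling                ADTypes v1 spelling (this PR)
# AutoSparseFiniteDiff()           ->  AutoSparse(AutoFiniteDiff())
# AutoSparseForwardDiff()          ->  AutoSparse(AutoForwardDiff())
# AutoSparseZygote()               ->  AutoSparse(AutoZygote())
# NonlinearSolve.AutoSparseEnzyme() -> AutoSparse(AutoEnzyme())

sparse_fd = AutoSparse(AutoForwardDiff(; chunksize = 32))  # chunk size stays on the dense backend
ADTypes.dense_ad(sparse_fd) isa AutoForwardDiff            # the wrapped dense backend is recoverable
```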

docs/src/basics/sparsity_detection.md

Lines changed: 2 additions & 2 deletions
@@ -59,9 +59,9 @@ refer to the documentation there for more details.
 If you constructed a Nonlinear Solver with a sparse AD type, for example

 ```julia
-NewtonRaphson(; autodiff = AutoSparseForwardDiff())
+NewtonRaphson(; autodiff = AutoSparse(AutoForwardDiff()))
 # OR
-TrustRegion(; autodiff = AutoSparseZygote())
+TrustRegion(; autodiff = AutoSparse(AutoZygote()))
 ```

 then NonlinearSolve will automatically perform matrix coloring and use sparse
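For completeness, a minimal end-to-end sketch of the new spelling (assumes a NonlinearSolve release containing this PR; the tiny diagonal system below is purely illustrative):

```julia
using NonlinearSolve  # AutoSparse and AutoForwardDiff come in via the ADTypes re-export

f!(du, u, p) = (du .= u .^ 2 .- p)           # in-place residual with a diagonal Jacobian
prob = NonlinearProblem(f!, ones(100), 2.0)

# Wrapping the dense backend in AutoSparse requests automated sparsity detection + coloring
sol = solve(prob, NewtonRaphson(; autodiff = AutoSparse(AutoForwardDiff())))
```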

docs/src/tutorials/large_systems.md

Lines changed: 6 additions & 6 deletions
@@ -128,7 +128,7 @@ include:
 In the next section, we will discuss how to declare a sparse Jacobian and how to use
 [Symbolics.jl](https://github.com/JuliaSymbolics/Symbolics.jl), to compute exact sparse
 jacobians. This is triggered if you pass in a sparse autodiff type such as
-`AutoSparseForwardDiff()`. If `Symbolics.jl` is loaded, then the default changes to
+`AutoSparse(AutoForwardDiff())`. If `Symbolics.jl` is loaded, then the default changes to
 Symbolic Sparsity Detection. See the manual entry on
 [Sparsity Detection](@ref sparsity-detection) for more details on the default.

@@ -137,13 +137,13 @@ using BenchmarkTools # for @btime

 @btime solve(prob_brusselator_2d, NewtonRaphson());
 @btime solve(prob_brusselator_2d,
-    NewtonRaphson(; autodiff = AutoSparseForwardDiff(; chunksize = 32)));
+    NewtonRaphson(; autodiff = AutoSparse(AutoForwardDiff(; chunksize = 32))));
 @btime solve(prob_brusselator_2d,
-    NewtonRaphson(;
-        autodiff = AutoSparseForwardDiff(; chunksize = 32), linsolve = KLUFactorization()));
+    NewtonRaphson(; autodiff = AutoSparse(AutoForwardDiff(; chunksize = 32)),
+        linsolve = KLUFactorization()));
 @btime solve(prob_brusselator_2d,
-    NewtonRaphson(;
-        autodiff = AutoSparseForwardDiff(; chunksize = 32), linsolve = KrylovJL_GMRES()));
+    NewtonRaphson(; autodiff = AutoSparse(AutoForwardDiff(; chunksize = 32)),
+        linsolve = KrylovJL_GMRES()));
 nothing # hide
 ```

src/NonlinearSolve.jl

Lines changed: 13 additions & 16 deletions
@@ -27,7 +27,7 @@ import PrecompileTools: @recompile_invalidations, @compile_workload, @setup_work

 import SciMLBase: AbstractNonlinearAlgorithm, JacobianWrapper, AbstractNonlinearProblem,
     AbstractSciMLOperator, NLStats, _unwrap_val, has_jac, isinplace
-import SparseDiffTools: AbstractSparsityDetection, AutoSparseEnzyme
+import SparseDiffTools: AbstractSparsityDetection
 import StaticArraysCore: StaticArray, SVector, SArray, MArray, Size, SMatrix, MMatrix
 import SymbolicIndexingInterface: SymbolicIndexingInterface, ParameterIndexingProxy,
     symbolic_container, parameter_values, state_values,
@@ -36,9 +36,6 @@ end

 @reexport using ADTypes, SciMLBase, SimpleNonlinearSolve

-const AbstractSparseADType = Union{ADTypes.AbstractSparseFiniteDifferences,
-    ADTypes.AbstractSparseForwardMode, ADTypes.AbstractSparseReverseMode}
-
 # Type-Inference Friendly Check for Extension Loading
 is_extension_loaded(::Val) = false

@@ -121,18 +118,18 @@ include("default.jl")

     @compile_workload begin
         @sync begin
-                for T in (Float32, Float64), (fn, u0) in nlfuncs
-                    Threads.@spawn NonlinearProblem(fn, T.(u0), T(2))
-                end
-                for (fn, u0) in nlfuncs
-                    Threads.@spawn NonlinearLeastSquaresProblem(fn, u0, 2.0)
-                end
-                for prob in probs_nls, alg in nls_algs
-                    Threads.@spawn solve(prob, alg; abstol = 1e-2, verbose = false)
-                end
-                for prob in probs_nlls, alg in nlls_algs
-                    Threads.@spawn solve(prob, alg; abstol = 1e-2, verbose = false)
-                end
+            for T in (Float32, Float64), (fn, u0) in nlfuncs
+                Threads.@spawn NonlinearProblem(fn, T.(u0), T(2))
+            end
+            for (fn, u0) in nlfuncs
+                Threads.@spawn NonlinearLeastSquaresProblem(fn, u0, 2.0)
+            end
+            for prob in probs_nls, alg in nls_algs
+                Threads.@spawn solve(prob, alg; abstol = 1e-2, verbose = false)
+            end
+            for prob in probs_nlls, alg in nlls_algs
+                Threads.@spawn solve(prob, alg; abstol = 1e-2, verbose = false)
+            end
         end
     end
 end
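The deleted `AbstractSparseADType` union has no replacement in the new code because ADTypes v1 collapses the three abstract sparse modes into one concrete wrapper type; a check like the following (a sketch, not code from this PR) suffices wherever the union was previously dispatched on:

```julia
using ADTypes

# One wrapper type now covers sparse finite differences, forward mode, and reverse mode
is_sparse_ad(ad::ADTypes.AbstractADType) = ad isa AutoSparse

is_sparse_ad(AutoSparse(AutoForwardDiff()))  # true
is_sparse_ad(AutoFiniteDiff())               # false
```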

src/adtypes.jl

Lines changed: 40 additions & 48 deletions
@@ -22,17 +22,6 @@ error into the derivative estimates.
 """
 AutoFiniteDiff

-"""
-    AutoSparseFiniteDiff()
-
-Sparse Version of [`AutoFiniteDiff`](@ref) that uses
-[FiniteDiff.jl](https://github.com/JuliaDiff/FiniteDiff.jl) and the column color vector of
-the Jacobian Matrix to efficiently compute the Sparse Jacobian.
-
-  - Supports both inplace and out-of-place functions
-"""
-AutoSparseFiniteDiff
-
 """
     AutoForwardDiff(; chunksize = nothing, tag = nothing)
     AutoForwardDiff{chunksize, tagType}(tag::tagType)
@@ -56,27 +45,6 @@ For type-stability of internal operations, a positive `chunksize` must be provid
 """
 AutoForwardDiff

-"""
-    AutoSparseForwardDiff(; chunksize = nothing, tag = nothing)
-    AutoSparseForwardDiff{chunksize, tagType}(tag::tagType)
-
-Sparse Version of [`AutoForwardDiff`](@ref) that uses
-[ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) and the column color vector of
-the Jacobian Matrix to efficiently compute the Sparse Jacobian.
-
-  - Supports both inplace and out-of-place functions
-
-For type-stability of internal operations, a positive `chunksize` must be provided.
-
-### Keyword Arguments
-
-  - `chunksize`: Count of dual numbers that can be propagated simultaneously. Setting this
-    number to a high value will lead to slowdowns. Use
-    [`NonlinearSolve.pickchunksize`](@ref) to get a proper value.
-  - `tag`: Used to avoid perturbation confusion. If set to `nothing`, we use a custom tag.
-"""
-AutoSparseForwardDiff
-
 """
     AutoPolyesterForwardDiff(; chunksize = nothing)

@@ -108,20 +76,6 @@ jacobians.
 """
 AutoZygote

-"""
-    AutoSparseZygote()
-
-Sparse version of [`AutoZygote`](@ref) that uses
-[`Zygote.jl`](https://github.com/FluxML/Zygote.jl) and the row color vector of
-the Jacobian Matrix to efficiently compute the Sparse Jacobian.
-
-  - Supports only out-of-place functions
-
-This is efficient only for long jacobians or if the maximum value of the row color vector is
-significantly lower than the maximum value of the column color vector.
-"""
-AutoSparseZygote
-
 """
     AutoEnzyme()

@@ -134,15 +88,53 @@ and VJP support is currently not implemented.
 AutoEnzyme

 """
-    AutoSparseEnzyme()
+    AutoSparse(AutoEnzyme())

 Sparse version of [`AutoEnzyme`](@ref) that uses
 [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) and the row color vector of
 the Jacobian Matrix to efficiently compute the Sparse Jacobian.

   - Supports both inplace and out-of-place functions

+This is efficient only for long jacobians or if the maximum value of the row color vector is
+significantly lower than the maximum value of the column color vector.
+
+    AutoSparse(AutoFiniteDiff())
+
+Sparse Version of [`AutoFiniteDiff`](@ref) that uses
+[FiniteDiff.jl](https://github.com/JuliaDiff/FiniteDiff.jl) and the column color vector of
+the Jacobian Matrix to efficiently compute the Sparse Jacobian.
+
+  - Supports both inplace and out-of-place functions
+
+    AutoSparse(AutoForwardDiff(; chunksize = nothing, tag = nothing))
+    AutoSparse(AutoForwardDiff{chunksize, tagType}(tag::tagType))
+
+Sparse Version of [`AutoForwardDiff`](@ref) that uses
+[ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) and the column color vector of
+the Jacobian Matrix to efficiently compute the Sparse Jacobian.
+
+  - Supports both inplace and out-of-place functions
+
+For type-stability of internal operations, a positive `chunksize` must be provided.
+
+### Keyword Arguments
+
+  - `chunksize`: Count of dual numbers that can be propagated simultaneously. Setting this
+    number to a high value will lead to slowdowns. Use
+    [`NonlinearSolve.pickchunksize`](@ref) to get a proper value.
+
+  - `tag`: Used to avoid perturbation confusion. If set to `nothing`, we use a custom tag.
+
+    AutoSparse(AutoZygote())
+
+Sparse version of [`AutoZygote`](@ref) that uses
+[`Zygote.jl`](https://github.com/FluxML/Zygote.jl) and the row color vector of
+the Jacobian Matrix to efficiently compute the Sparse Jacobian.
+
+  - Supports only out-of-place functions
+
 This is efficient only for long jacobians or if the maximum value of the row color vector is
 significantly lower than the maximum value of the column color vector.
 """
-AutoSparseEnzyme
+AutoSparse

src/algorithms/trust_region.jl

Lines changed: 4 additions & 3 deletions
@@ -26,13 +26,14 @@ function TrustRegion(; concrete_jac = nothing, linsolve = nothing, precs = DEFAU
         shrink_factor::Real = 1 // 4, expand_factor::Real = 2 // 1,
         max_shrink_times::Int = 32, autodiff = nothing, vjp_autodiff = nothing)
     descent = Dogleg(; linsolve, precs)
-    if autodiff isa
-            Union{ADTypes.AbstractForwardMode, ADTypes.AbstractFiniteDifferencesMode}
+    if autodiff !== nothing && ADTypes.mode(autodiff) isa ADTypes.ForwardMode
         forward_ad = autodiff
     else
         forward_ad = nothing
     end
-    if isnothing(vjp_autodiff) && autodiff isa ADTypes.AbstractFiniteDifferencesMode
+    if isnothing(vjp_autodiff) &&
+       autodiff isa Union{ADTypes.AutoFiniteDiff, ADTypes.AutoFiniteDifferences}
+        # TODO: why not just ForwardMode?
         vjp_autodiff = autodiff
     end
     trustregion = GenericTrustRegionScheme(;
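The forward-mode check above (and the matching reverse-mode check in the next file) relies on ADTypes v1's trait-style `ADTypes.mode` query instead of the removed abstract mode supertypes. A small sketch of how that query classifies backends, as assumed here per ADTypes v1 and not taken from this diff:

```julia
using ADTypes

ADTypes.mode(AutoForwardDiff()) isa ADTypes.ForwardMode  # true
ADTypes.mode(AutoFiniteDiff()) isa ADTypes.ForwardMode   # true: finite differences are classified as forward mode
ADTypes.mode(AutoZygote()) isa ADTypes.ReverseMode       # true
```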

src/core/generalized_first_order.jl

Lines changed: 8 additions & 4 deletions
@@ -55,10 +55,14 @@ function GeneralizedFirstOrderAlgorithm{concrete_jac, name}(;
         descent, linesearch = missing, trustregion = missing,
         jacobian_ad = nothing, forward_ad = nothing, reverse_ad = nothing,
         max_shrink_times::Int = typemax(Int)) where {concrete_jac, name}
-    forward_ad = ifelse(forward_ad !== nothing, forward_ad,
-        ifelse(jacobian_ad isa ADTypes.AbstractForwardMode, jacobian_ad, nothing))
-    reverse_ad = ifelse(reverse_ad !== nothing, reverse_ad,
-        ifelse(jacobian_ad isa ADTypes.AbstractReverseMode, jacobian_ad, nothing))
+    forward_ad = ifelse(forward_ad !== nothing,
+        forward_ad,
+        ifelse(jacobian_ad !== nothing && ADTypes.mode(jacobian_ad) isa ADTypes.ForwardMode,
+            jacobian_ad, nothing))
+    reverse_ad = ifelse(reverse_ad !== nothing,
+        reverse_ad,
+        ifelse(jacobian_ad !== nothing && ADTypes.mode(jacobian_ad) isa ADTypes.ReverseMode,
+            jacobian_ad, nothing))

     if linesearch !== missing && !(linesearch isa AbstractNonlinearSolveLineSearchAlgorithm)
         Base.depwarn("Passing in a `LineSearches.jl` algorithm directly is deprecated. \

src/globalization/line_search.jl

Lines changed: 1 addition & 1 deletion
@@ -122,7 +122,7 @@ function __internal_init(
         end
     else
         autodiff = get_concrete_reverse_ad(
-            alg.autodiff, prob; check_forward_mode = true)
+            alg.autodiff, prob; check_reverse_mode = true)
         vjp_op = VecJacOperator(prob, fu, u; autodiff)
         if isinplace(prob)
             g_cache = similar(u)

src/internal/approximate_initialization.jl

Lines changed: 1 addition & 1 deletion
@@ -149,7 +149,7 @@ function __internal_init(
         prob::AbstractNonlinearProblem, alg::TrueJacobianInitialization, solver, f::F, fu,
         u, p; linsolve = missing, internalnorm::IN = DEFAULT_NORM, kwargs...) where {F, IN}
     autodiff = get_concrete_forward_ad(
-        alg.autodiff, prob; check_reverse_mode = false, kwargs...)
+        alg.autodiff, prob; check_forward_mode = false, kwargs...)
     jac_cache = JacobianCache(prob, solver, prob.f, fu, u, p; autodiff, linsolve)
     J = alg.structure(jac_cache(nothing))
     return InitializedApproximateJacobianCache(
