
Commit e0f156c

Cleanup before release (#149)
1 parent 738b482 commit e0f156c

File tree

7 files changed, +13 −14 lines changed

CITATION.bib

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ @misc{ImplicitDifferentiation.jl
   url = {https://github.com/gdalle/ImplicitDifferentiation.jl},
   version = {v0.6.0},
   year = {2024},
-  month = {4}
+  month = {6}
 }

 @phdthesis{dalle:tel-04053322,

README.md

Lines changed: 2 additions & 2 deletions
@@ -34,13 +34,13 @@ If you want a deeper dive into the theory, you can refer to the paper [_Efficien
 To install the stable version, open a Julia REPL and run:

 ```julia
-julia> using Pkg; Pkg.add("ImplicitDifferentiation")
+using Pkg; Pkg.add("ImplicitDifferentiation")
 ```

 For the latest version, run this instead:

 ```julia
-julia> using Pkg; Pkg.add(url="https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl")
+using Pkg; Pkg.add(url="https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl")
 ```

 Please read the [documentation](https://JuliaDecisionFocusedLearning.github.io/ImplicitDifferentiation.jl/stable/), especially the examples and FAQ.

examples/0_intro.jl

Lines changed: 3 additions & 3 deletions
@@ -81,17 +81,17 @@ We represent it using a type called [`ImplicitFunction`](@ref), which you will s
 =#

 #=
-First we define a forward mapping corresponding to the function we consider.
+First we define a `forward` mapping corresponding to the function we consider.
 It returns the actual output $y(x)$ of the function, and can be thought of as a black box solver.
-Importantly, this Julia callable _doesn't need to be differentiable by automatic differentiation packages but the underlying function still needs to be mathematically differentiable_.
+Importantly, this Julia callable doesn't need to be differentiable by automatic differentiation packages but the underlying function still needs to be mathematically differentiable.
 =#

 forward(x) = badsqrt(x);

 #=
 Then we define `conditions` $c(x, y) = 0$ that the output $y(x)$ is supposed to satisfy.
 These conditions must be array-valued, with the same size as $y$.
-Unlike the forward mapping, _the conditions need to be differentiable by automatic differentiation packages_ with respect to both $x$ and $y$.
+Unlike the forward mapping, the conditions need to be differentiable by automatic differentiation packages with respect to both $x$ and $y$.
 Here the conditions are very obvious: the square of the square root should be equal to the original value.
 =#
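Putting the two pieces from this example together, a minimal sketch (assuming the `ImplicitFunction` constructor documented in `src/implicit_function.jl`, and substituting the built-in `sqrt` for the example's `badsqrt` so the snippet is self-contained):

```julia
using ImplicitDifferentiation

# Black-box solver: does not need to be AD-compatible.
# `sqrt.` stands in for the example's `badsqrt`.
forward(x) = sqrt.(x)

# Conditions c(x, y) = 0: the square of the square root
# should recover the input. Must be AD-compatible.
conditions(x, y) = y .^ 2 .- x

implicit = ImplicitFunction(forward, conditions)
```

Calling `implicit(x)` should then behave like `forward(x)` while being differentiable by packages such as ForwardDiff.jl.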

examples/1_basic.jl

Lines changed: 1 addition & 1 deletion
@@ -86,7 +86,7 @@ ForwardDiff.jacobian(_x -> implicit_optim(_x; method=LBFGS()), x)
 @test ForwardDiff.jacobian(_x -> implicit_optim(_x; method=LBFGS()), x) ≈ J #src

 #=
-In this instance, we could use ForwardDiff.jl directly on the solver, but it returns the wrong result (not sure why).
+In this instance, we could use ForwardDiff.jl directly on the solver:
 =#

 ForwardDiff.jacobian(_x -> forward_optim(_x; method=LBFGS()), x)

examples/2_advanced.jl

Lines changed: 1 addition & 2 deletions
@@ -1,8 +1,7 @@
 # # Advanced use cases

 #=
-We dive into more advanced applications of implicit differentiation:
-- constrained optimization problems
+We dive into more advanced applications of implicit differentiation.
 =#

 using ForwardDiff

examples/3_tricks.jl

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ function conditions_components_aux(a, b, m, d, e)
     return c_d, c_e
 end;

-# You can use `ComponentVector` as an intermediate storage.
+# You can use `ComponentVector` from [ComponentArrays.jl](https://github.com/jonniedie/ComponentArrays.jl) as an intermediate storage.

 function forward_components(x::ComponentVector)
     d, e = forward_components_aux(x.a, x.b, x.m)
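As an illustration of that intermediate storage (field names `a`, `b`, `m` follow the surrounding example; this assumes the public `ComponentVector` keyword constructor from ComponentArrays.jl):

```julia
using ComponentArrays

# Bundle heterogeneous inputs into one flat vector
# while keeping named access to each component.
x = ComponentVector(a=[1.0, 2.0], b=[3.0], m=4.0)

x.a        # access the `a` block by name
length(x)  # behaves like a plain length-4 vector for solvers
```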

src/implicit_function.jl

Lines changed: 4 additions & 4 deletions
@@ -57,8 +57,8 @@ The value of `lazy` must be chosen together with the `linear_solver`, see below.
 - `forward`: a callable computing `y(x)`, does not need to be compatible with automatic differentiation
 - `conditions`: a callable computing `c(x, y)`, must be compatible with automatic differentiation
 - `linear_solver`: a callable to solve the linear system
-- `conditions_x_backend`: defines how the conditions will be differentiated with respect to the first argument `x`
-- `conditions_y_backend`: defines how the conditions will be differentiated with respect to the second argument `y`
+- `conditions_x_backend`: how the conditions will be differentiated w.r.t. the first argument `x`
+- `conditions_y_backend`: how the conditions will be differentiated w.r.t. the second argument `y`

 # Function signatures

@@ -79,7 +79,7 @@ The byproduct `z` and the other positional arguments `args...` beyond `x` are co

 The provided `linear_solver` object needs to be callable, with two methods:
 - `(A, b::AbstractVector) -> s::AbstractVector` such that `A * s = b`
-- `(A, B::AbstractVector) -> S::AbstractMatrix` such that `A * S = B`
+- `(A, B::AbstractMatrix) -> S::AbstractMatrix` such that `A * S = B`

 It can be either a direct solver (like `\\`) or an iterative one (like [`KrylovLinearSolver`](@ref)).
 Typically, direct solvers work best with dense Jacobians (`lazy = false`) while iterative solvers work best with operators (`lazy = true`).

@@ -105,7 +105,7 @@ end
     forward, conditions;
     linear_solver=lazy ? KrylovLinearSolver() : \\,
     conditions_x_backend=nothing,
-    conditions_x_backend=nothing,
+    conditions_y_backend=nothing,
 )

 Constructor for an [`ImplicitFunction`](@ref) which picks the `linear_solver` automatically based on the `lazy` parameter.
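The two-method `linear_solver` contract in the docstring can be made concrete with a small callable. The built-in `\` already satisfies it; this hypothetical `DenseLinearSolver` (not part of the package) just spells out the two signatures:

```julia
# Hypothetical direct solver satisfying the documented contract:
# one method for a single right-hand side, one for a matrix of them.
struct DenseLinearSolver end

(::DenseLinearSolver)(A, b::AbstractVector) = A \ b  # solves A * s = b
(::DenseLinearSolver)(A, B::AbstractMatrix) = A \ B  # solves A * S = B
```

Passed as `linear_solver=DenseLinearSolver()` to the `ImplicitFunction` constructor, it would play the same role as the default `\`.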
