Commit 9b75dbb

note about promotion (#94)
1 parent b0cbeaa commit 9b75dbb

File tree

1 file changed: 10 additions, 2 deletions


src/destructure.jl

Lines changed: 10 additions & 2 deletions
@@ -11,12 +11,20 @@ Differentiable.
 
 # Example
 ```jldoctest
-julia> v, re = destructure((x=[1.0, 2.0], y=(sin, [3 + 4im])))
+julia> v, re = destructure((x=[1.0, 2.0], y=(sin, [3.0 + 4.0im])))
 (ComplexF64[1.0 + 0.0im, 2.0 + 0.0im, 3.0 + 4.0im], Restructure(NamedTuple, ..., 3))
 
-julia> re([3, 5-im, 7+11im])
+julia> re([3, 5, 7+11im])
 (x = [3.0, 5.0], y = (sin, ComplexF64[7.0 + 11.0im]))
 ```
+
+If `model` contains various number types, they are promoted to make `vector`,
+and are usually restored by `Restructure`. Such restoration follows the rules
+of `ChainRulesCore.ProjectTo`, and thus will restore floating point precision,
+but will permit more exotic numbers like `ForwardDiff.Dual`.
+
+If `model` contains only GPU arrays, then `vector` will also live on the GPU.
+At present, a mixture of GPU and ordinary CPU arrays is undefined behaviour.
 """
 function destructure(x)
   flat, off, len = _flatten(x)
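
As a rough illustration of the promotion note added in this commit (not part of the diff itself), here is a minimal sketch of the behaviour it describes. The `model` below is hypothetical; the exact element types follow Julia's usual promotion rules and `ChainRulesCore.ProjectTo`.

```julia
using Optimisers

# Hypothetical model mixing Float32 and Float64 leaves (illustration only).
model = (a = Float32[1, 2], b = [3.0, 4.0])

# Flattening promotes the mixed number types, so `v` is expected to be
# a Vector{Float64} here: [1.0, 2.0, 3.0, 4.0].
v, re = destructure(model)

# Restructuring follows ChainRulesCore.ProjectTo, so the Float32 leaf should
# come back as Float32 while the Float64 leaf stays Float64.
m2 = re([10, 20, 30, 40])
# expected: m2.a isa Vector{Float32}, m2.b isa Vector{Float64}
```

This mirrors the note's wording: floating-point precision is restored on reconstruction, while more exotic element types such as `ForwardDiff.Dual` are permitted to pass through.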
