
Commit 80fb0f1

Make private for now
1 parent 7c8fad5 commit 80fb0f1

2 files changed: +1 −20 lines


NEWS.md

Lines changed: 0 additions & 3 deletions
```diff
@@ -1,8 +1,5 @@
 # Flux Release Notes
 
-## v0.13.5
-* `loadmodel!` now supports a [`filter` keyword argument](https://github.com/FluxML/Flux.jl/pull/2041)
-
 ## v0.13.4
 * Added [`PairwiseFusion` layer](https://github.com/FluxML/Flux.jl/pull/1983)
 
```

src/loading.jl

Lines changed: 1 addition & 17 deletions
````diff
@@ -33,7 +33,7 @@ _filter_children(f, children::NamedTuple) =
 _filter_children(f, children) = filter(f, children)
 
 """
-    loadmodel!(dst, src; filter = _ -> true)
+    loadmodel!(dst, src)
 
 Copy all the parameters (trainable and non-trainable) from `src` into `dst`.
 
@@ -43,9 +43,6 @@ Non-array elements (such as activation functions) are not copied and need not match.
 Zero bias vectors and `bias=false` are considered equivalent
 (see extended help for more details).
 
-Specify the predicate function `filter` to control what is recursed.
-A child node `x` in either `dst` and `src` is skipped when `filter(x) == false`.
-
 # Examples
 ```julia
 julia> dst = Chain(Dense(Flux.ones32(2, 5), Flux.ones32(2), tanh), Dense(2 => 1; bias = [1f0]))
@@ -66,19 +63,6 @@ false
 
 julia> iszero(dst[2].bias)
 true
-
-julia> src = Chain(Dense(5 => 2), Dropout(0.2), Dense(2 => 1))
-Chain(
-  Dense(5 => 2),                        # 12 parameters
-  Dropout(0.2),
-  Dense(2 => 1),                        # 3 parameters
-)                   # Total: 4 arrays, 15 parameters, 348 bytes.
-
-julia> Flux.loadmodel!(dst, src; filter = x -> !(x isa Dropout)) # skips loading Dropout
-Chain(
-  Dense(5 => 2, tanh),                  # 12 parameters
-  Dense(2 => 1),                        # 3 parameters
-)                   # Total: 4 arrays, 15 parameters, 316 bytes.
 ```
 
 # Extended help
````
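The `filter` keyword removed from the docstring above let `loadmodel!` skip children (such as a `Dropout` layer) during its recursive parameter copy. The idea can be sketched outside Julia; the helper below is a hypothetical Python analogue over nested dicts (`load_state` and `keep` are invented names, not Flux's actual code):

```python
# Hypothetical sketch (not Flux's implementation): recursively copy state
# from `src` into `dst`, skipping any child for which the predicate `keep`
# returns False -- mirroring the `filter` keyword this commit makes private.
def load_state(dst, src, keep=lambda _: True):
    for key, src_child in src.items():
        if key not in dst or not keep(src_child):
            continue  # child rejected by the predicate: leave dst untouched
        if isinstance(src_child, dict):
            load_state(dst[key], src_child, keep)  # recurse into submodule
        else:
            dst[key] = src_child  # leaf: copy the parameter over
    return dst

# Toy "models" as nested dicts; skipping the dropout node is analogous to
# `filter = x -> !(x isa Dropout)` in the removed docstring example.
dst = {"dense1": {"weight": [0.0, 0.0]}, "drop": {"p": 0.5}, "dense2": {"weight": [0.0]}}
src = {"dense1": {"weight": [1.0, 1.0]}, "drop": {"p": 0.2}, "dense2": {"weight": [2.0]}}
load_state(dst, src, keep=lambda child: child != {"p": 0.2})
print(dst["dense1"]["weight"])  # [1.0, 1.0] -- copied
print(dst["drop"]["p"])         # 0.5 -- left untouched
```

The predicate sees each child before recursion, so rejecting a node prunes its entire subtree from the copy, which is what makes it useful for stateful layers whose configuration should not be overwritten.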
