
Commit 694f519 (parent: a779696)

deprecate weight keyword arg in Conv constructor

File tree: 5 files changed (+11, −8 lines)


NEWS.md (1 addition, 0 deletions)

@@ -6,6 +6,7 @@
   been removed in favour of MLDatasets.jl.
 * `params` is not exported anymore since it is a common name and is also exported by Distributions.jl
 * `flatten` is not exported anymore due to clash with Iterators.flatten.
+* Remove Juno.jl progress bar support as it is now obsolete.

 ## v0.12.9
 * Fixed incorrect output and added GPU compatibility for [AlphaDropout](https://github.com/FluxML/Flux.jl/pull/1781).
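Since `params` is no longer exported, call sites must qualify it with the module name. A minimal migration sketch (the `Dense(2, 3)` layer is illustrative):

```julia
using Flux

m = Dense(2, 3)       # any layer with trainable parameters
ps = Flux.params(m)   # qualified call; bare `params(m)` no longer works without an explicit import

length(ps)            # 2 entries: the weight matrix and the bias vector
```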

Project.toml (0 additions, 2 deletions)

@@ -7,7 +7,6 @@ Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
 ArrayInterface = "4fba245c-0d91-5ea0-9b3e-6abc04ee57a9"
 CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
 Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
-Juno = "e5e0dc1b-0480-54bc-9374-aad01c23163d"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 MacroTools = "1914dd2f-81c6-5fcd-8719-6d5c9610ff09"
 NNlib = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
@@ -25,7 +24,6 @@ Adapt = "3.0"
 ArrayInterface = "3.1"
 CUDA = "3"
 Functors = "0.2.1"
-Juno = "0.8"
 MacroTools = "0.5"
 NNlib = "0.7.24"
 NNlibCUDA = "0.1.7"

docs/src/models/basics.md (2 additions, 2 deletions)

@@ -35,9 +35,9 @@ These gradients are based on `x` and `y`. Flux works by instead taking gradients
 Machine learning often can have *hundreds* of parameters, so Flux lets you work with collections of parameters, via the `params` functions. You can get the gradient of all parameters used in a program without explicitly passing them in.

 ```jldoctest basics
-julia> x = [1, 2];
+julia> x = [2, 1];

-julia> y = [3, 5];
+julia> y = [2, 0];

 julia> gs = gradient(Flux.params(x, y)) do
          f(x, y)
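The updated doctest values can be checked by hand; a sketch of the full computation, assuming `f(x, y) = sum((x .- y).^2)` as the loss defined earlier on that docs page:

```julia
using Flux

f(x, y) = sum((x .- y).^2)   # assumed definition from earlier in the docs page

x = [2, 1];
y = [2, 0];

# Implicit-parameter gradient: no need to pass x and y to the closure explicitly.
gs = gradient(Flux.params(x, y)) do
  f(x, y)
end

gs[x]   # 2 .* (x .- y) == [0, 2]
gs[y]   # -2 .* (x .- y) == [0, -2]
```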

src/layers/conv.jl (7 additions, 2 deletions)

@@ -136,8 +136,13 @@ end

 function Conv(k::NTuple{N,Integer}, ch::Pair{<:Integer,<:Integer}, σ = identity;
             init = glorot_uniform, stride = 1, pad = 0, dilation = 1, groups = 1,
-            weight = convfilter(k, ch; init, groups), bias = true) where N
-
+            weight = nothing, bias = true) where N
+  if weight !== nothing
+    # TODO remove in v0.14
+    Base.depwarn("The `weight` keyword arg is deprecated, use the Conv(weight, ...) constructor instead", :conv_weight)
+  else
+    weight = convfilter(k, ch; init, groups)
+  end
   Conv(weight, bias, σ; stride, pad, dilation, groups)
 end
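The deprecation changes how a custom filter array is supplied to `Conv`. A sketch of both forms (the 3×3 kernel, channel sizes, and `relu` are illustrative; the weight array follows the width × height × in-channels × out-channels layout that `convfilter` produces):

```julia
using Flux

w = Flux.convfilter((3, 3), 4 => 8)   # 3×3×4×8 filter array

# Deprecated after this commit: filter supplied via the `weight` keyword.
c_old = Conv((3, 3), 4 => 8, relu; weight = w)   # still works, but emits a depwarn

# Preferred: pass the filter to the array-based constructor directly.
c_new = Conv(w, true, relu)   # `true` requests a trainable zero-initialised bias
```

Both layers hold the same weights, so existing code only needs the constructor call swapped, not retraining.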

src/optimise/train.jl (1 addition, 2 deletions)

@@ -1,4 +1,3 @@
-using Juno
 import Zygote: Params, gradient

@@ -139,7 +138,7 @@ hello
 ```
 """
 macro epochs(n, ex)
-  :(@progress for i = 1:$(esc(n))
+  :(for i = 1:$(esc(n))
     @info "Epoch $i"
     $(esc(ex))
   end)
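With Juno's `@progress` gone, `@epochs` now just logs each epoch via `@info`; its behaviour is otherwise unchanged. A usage sketch (the `push!` body stands in for a real training step):

```julia
using Flux: @epochs

steps = Int[]
# Runs the expression twice, logging "[ Info: Epoch 1" and "[ Info: Epoch 2".
@epochs 2 push!(steps, length(steps) + 1)

steps   # == [1, 2]
```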
