
Commit 8310daf

v0.13 deprecations
1 parent 841afe7

28 files changed: +162 −719 lines

NEWS.md

Lines changed: 7 additions & 0 deletions
@@ -1,5 +1,12 @@
 # Flux Release Notes
 
+## v0.13
+* After a deprecation cycle, the datasets in `Flux.Data` have
+  been removed in favour of MLDatasets.jl.
+* `params` is no longer exported, since it is a common name that is also exported by Distributions.jl.
+* `flatten` is no longer exported, due to a clash with `Iterators.flatten`.
+* Remove Juno.jl progress bar support, as it is now obsolete.
+
 ## v0.12.10
 * `Dropout`/`AlphaDropout` now supports [user-specified RNGs](https://github.com/FluxML/Flux.jl/pull/1838)
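Concretely, the two un-exports change how user code spells these calls. A minimal migration sketch (the model and array shapes below are illustrative, not part of this commit):

```julia
using Flux

m = Chain(Dense(10, 5, relu), Dense(5, 2))

# `params` must now be qualified, leaving the bare name to Distributions.jl:
ps = Flux.params(m)

# `flatten` must likewise be qualified, avoiding Iterators.flatten:
x = rand(Float32, 2, 5, 16)  # e.g. a 2×5 feature map over a batch of 16
Flux.flatten(x)              # 10×16 Matrix; the batch dimension is preserved
```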

Project.toml

Lines changed: 1 addition & 11 deletions
@@ -1,15 +1,11 @@
 name = "Flux"
 uuid = "587475ba-b771-5e3f-ad9e-33799f191a9c"
-version = "0.12.9"
+version = "0.13.0-DEV"
 
 [deps]
-AbstractTrees = "1520ce14-60c1-5f80-bbc7-55ef81b5835c"
 Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
 ArrayInterface = "4fba245c-0d91-5ea0-9b3e-6abc04ee57a9"
 CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
-CodecZlib = "944b1d66-785c-5afd-91f1-9de20f533193"
-Colors = "5ae59095-9a9b-59fe-a467-6f913c188581"
-DelimitedFiles = "8bb1440f-4735-579b-a4ab-409b98df4dab"
 Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 MacroTools = "1914dd2f-81c6-5fcd-8719-6d5c9610ff09"
@@ -20,29 +16,23 @@ Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
 ProgressLogging = "33c8b6b6-d38a-422a-b730-caa89a2f386c"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
-SHA = "ea8e919c-243c-51af-8825-aaa63cd721ce"
 SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
 Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
 StatsBase = "2913bbd2-ae8a-5f71-8c99-4fb6c76f3a91"
 Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
-ZipFile = "a5390f91-8eb1-5f08-bee0-b1d1ffed6cea"
 Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
 
 [compat]
-AbstractTrees = "0.3"
 Adapt = "3.0"
 ArrayInterface = "3.1, 4"
 CUDA = "3"
-CodecZlib = "0.7"
-Colors = "0.12"
 Functors = "0.2.1"
 MacroTools = "0.5"
 NNlib = "0.8"
 NNlibCUDA = "0.2"
 ProgressLogging = "0.1"
 Reexport = "0.2, 1.0"
 StatsBase = "0.33"
-ZipFile = "0.9"
 Zygote = "0.6"
 julia = "1.6"

docs/src/models/advanced.md

Lines changed: 2 additions & 2 deletions
@@ -97,8 +97,8 @@ We can freeze a specific parameter of a specific layer which already entered a `
 by simply deleting it from `ps`:
 
 ```julia
-ps = params(m)
-delete!(ps, m[2].bias)
+ps = Flux.params(m)
+delete!(ps, m[2].bias)
 ```
 
 ## Custom multiple input or output layer
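The `delete!` pattern shown above works because the deleted entry is simply absent from the `Params` collection, so optimisers never see it. A quick hedged illustration (the two-layer `m` here is an assumed stand-in for the page's model):

```julia
using Flux

m = Chain(Dense(3, 4), Dense(4, 2))
ps = Flux.params(m)
delete!(ps, m[2].bias)      # freeze the second layer's bias
@assert !(m[2].bias in ps)  # it will no longer receive gradient updates
```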

docs/src/models/basics.md

Lines changed: 2 additions & 2 deletions
@@ -39,7 +39,7 @@ julia> x = [2, 1];
 
 julia> y = [2, 0];
 
-julia> gs = gradient(params(x, y)) do
+julia> gs = gradient(Flux.params(x, y)) do
          f(x, y)
        end
 Grads(...)
@@ -83,7 +83,7 @@ To improve the prediction we can take the gradients of the loss with respect to
 ```julia
 using Flux
 
-gs = gradient(() -> loss(x, y), params(W, b))
+gs = gradient(() -> loss(x, y), Flux.params(W, b))
 ```
 
 Now that we have gradients, we can pull them out and update `W` to train the model.
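That last context line can be made concrete; a short sketch of the update step (assuming the `W` and learning rate used elsewhere on that page):

```julia
W̄ = gs[W]        # gradient of the loss with respect to W
W .-= 0.1 .* W̄   # in-place gradient-descent step, learning rate 0.1
```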

docs/src/models/recurrence.md

Lines changed: 1 addition & 1 deletion
@@ -160,7 +160,7 @@ data = zip(X,Y)
 Flux.reset!(m)
 [m(x) for x in seq_init]
 
-ps = params(m)
+ps = Flux.params(m)
 opt = ADAM(1e-3)
 Flux.train!(loss, ps, data, opt)
 ```

docs/src/saving.md

Lines changed: 1 addition & 1 deletion
@@ -62,7 +62,7 @@ julia> using Flux
 julia> model = Chain(Dense(10,5,relu),Dense(5,2),softmax)
 Chain(Dense(10, 5, NNlib.relu), Dense(5, 2), NNlib.softmax)
 
-julia> weights = params(model);
+julia> weights = Flux.params(model);
 
 julia> using BSON: @save
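Continuing that snippet, the collected weights round-trip through BSON; a minimal sketch (the file name is illustrative):

```julia
using Flux
using BSON: @save, @load

model = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
weights = Flux.params(model)
@save "mymodel.bson" weights   # write the parameters to disk
@load "mymodel.bson" weights   # later: read them back into `weights`
```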

docs/src/training/optimisers.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ loss(x, y) = sum((predict(x) .- y).^2)
 x, y = rand(5), rand(2) # Dummy data
 l = loss(x, y) # ~ 3
 
-θ = params(W, b)
+θ = Flux.params(W, b)
 grads = gradient(() -> loss(x, y), θ)
 ```
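To close the loop, the gradients in `grads` are applied with an optimiser; a hedged sketch using `Descent` and `Flux.Optimise.update!` (standard Flux API, though not part of this diff):

```julia
using Flux

opt = Descent(0.1)                    # plain gradient descent, η = 0.1
Flux.Optimise.update!(opt, θ, grads)  # update every parameter in θ in place
```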

docs/src/training/training.md

Lines changed: 1 addition & 1 deletion
@@ -64,7 +64,7 @@ At first glance it may seem strange that the model that we want to train is not
 
 ## Model parameters
 
-The model to be trained must have a set of tracked parameters that are used to calculate the gradients of the objective function. In the [basics](../models/basics.md) section it is explained how to create models with such parameters. The second argument of the function `Flux.train!` must be an object containing those parameters, which can be obtained from a model `m` as `params(m)`.
+The model to be trained must have a set of tracked parameters that are used to calculate the gradients of the objective function. In the [basics](../models/basics.md) section it is explained how to create models with such parameters. The second argument of the function `Flux.train!` must be an object containing those parameters, which can be obtained from a model `m` as `Flux.params(m)`.
 
 Such an object contains a reference to the model's parameters, not a copy, such that after their training, the model behaves according to their updated values.
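Putting that paragraph together with the now-qualified `Flux.params`, a minimal end-to-end sketch (the model, loss, and dummy batch are illustrative):

```julia
using Flux

m = Dense(10, 2)
loss(x, y) = Flux.Losses.mse(m(x), y)
data = [(rand(Float32, 10, 16), rand(Float32, 2, 16))]  # one dummy batch

ps = Flux.params(m)  # references, not copies, of m's parameters
Flux.train!(loss, ps, data, ADAM(1e-3))
```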

src/Flux.jl

Lines changed: 2 additions & 2 deletions
@@ -4,19 +4,19 @@ module Flux
 
 using Base: tail
 using Statistics, Random, LinearAlgebra
 using Zygote, MacroTools, ProgressLogging, Reexport
 using MacroTools: @forward
 @reexport using NNlib
 using Zygote: Params, @adjoint, gradient, pullback, @nograd
 export gradient
 
-export Chain, Dense, Maxout, SkipConnection, Parallel, flatten,
+export Chain, Dense, Maxout, SkipConnection, Parallel,
   RNN, LSTM, GRU, GRUv3,
   SamePad, Conv, CrossCor, ConvTranspose, DepthwiseConv,
   AdaptiveMaxPool, AdaptiveMeanPool, GlobalMaxPool, GlobalMeanPool, MaxPool, MeanPool,
   Dropout, AlphaDropout, LayerNorm, BatchNorm, InstanceNorm, GroupNorm,
   Upsample, PixelShuffle,
-  params, fmap, cpu, gpu, f32, f64,
+  fmap, cpu, gpu, f32, f64,
   testmode!, trainmode!
 
 include("optimise/Optimise.jl")

src/data/Data.jl

Lines changed: 0 additions & 58 deletions
@@ -6,62 +6,4 @@ using Base: @propagate_inbounds
 include("dataloader.jl")
 export DataLoader
 
-## TODO for v0.13: remove everything below ##############
-## Also remove the following deps:
-## AbstractTrees, ZipFiles, CodecZLib
-
-import ..Flux
-import SHA
-
-deprecation_message() = @warn("Flux's datasets are deprecated, please use the package MLDatasets.jl")
-
-function deps(path...)
-  if isnothing(@__DIR__) # sysimages
-    joinpath("deps", path...)
-  else
-    joinpath(@__DIR__, "..", "..", "deps", path...)
-  end
-end
-
-function download_and_verify(url, path, hash)
-  tmppath = tempname()
-  download(url, tmppath)
-  hash_download = open(tmppath) do f
-    bytes2hex(SHA.sha256(f))
-  end
-  if hash_download !== hash
-    msg = "Hash Mismatch!\n"
-    msg *= "  Expected sha256:   $hash\n"
-    msg *= "  Calculated sha256: $hash_download"
-    error(msg)
-  end
-  mv(tmppath, path; force=true)
-end
-
-function __init__()
-  mkpath(deps())
-end
-
-include("mnist.jl")
-export MNIST
-
-include("fashion-mnist.jl")
-export FashionMNIST
-
-include("cmudict.jl")
-export CMUDict
-using .CMUDict; export cmudict
-
-include("tree.jl")
-include("sentiment.jl")
-export Sentiment
-
-include("iris.jl")
-export Iris
-
-include("housing.jl")
-export Housing
-
-#########################################
-
 end#module
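For downstream code, the NEWS entry points to MLDatasets.jl as the replacement for the datasets removed here; a hedged migration sketch (assuming MLDatasets' `traindata`/`testdata` accessors, the API current when this commit was made):

```julia
using MLDatasets

# Before (removed in this commit):
#   imgs   = Flux.Data.MNIST.images()
#   labels = Flux.Data.MNIST.labels()

# After, with MLDatasets.jl:
train_x, train_y = MLDatasets.MNIST.traindata(Float32)  # 28×28×60000 array, 60000 labels
test_x,  test_y  = MLDatasets.MNIST.testdata(Float32)
```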
