
Commit 7520e34
Small fixes
1 parent 837352a

5 files changed (+18 −15 lines)

docs/src/lecture_11/data/mnist.jl
Lines changed: 3 additions & 3 deletions

@@ -11,7 +11,7 @@ function reshape_data(X::AbstractArray{T, 3}, y::AbstractVector) where T
 end
 
 function train_or_load!(file_name, m, X, y; force=false, kwargs...)
-
+
     !isdir(dirname(file_name)) && mkpath(dirname(file_name))
 
     if force || !isfile(file_name)
@@ -24,8 +24,8 @@ end
 
 function load_data(dataset; onehot=false, T=Float32)
     classes = 0:9
-    X_train, y_train = reshape_data(dataset.traindata(T)...)
-    X_test, y_test = reshape_data(dataset.testdata(T)...)
+    X_train, y_train = reshape_data(dataset(T, :train)[:]...)
+    X_test, y_test = reshape_data(dataset(T, :test)[:]...)
    y_train = T.(y_train)
    y_test = T.(y_test)
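
A quick sketch of the data-access pattern this commit migrates to may help: in the newer MLDatasets interface (v0.7-style, assumed here), a dataset object is constructed first and then indexed, instead of calling `traindata`/`testdata`. The variable names below are illustrative and not part of the commit:

```julia
using MLDatasets

# Construct a dataset object for the training split; this replaces
# the older MNIST.traindata(Float32) accessor.
train = MNIST(Float32, :train)

# Indexing with [:] returns all samples at once as a (features, targets)
# named tuple, which destructures into two arrays.
X_train, y_train = train[:]

size(X_train)    # expected (28, 28, 60000)
length(y_train)  # expected 60000
```

Splatting the `[:]` result, as `reshape_data(dataset(T, :train)[:]...)` does above, passes the features and labels as two positional arguments.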

docs/src/lecture_11/data/mnist_gpu.jl
Lines changed: 2 additions & 2 deletions

@@ -38,8 +38,8 @@ end
 
 function load_data(dataset; onehot=false, T=Float32)
     classes = 0:9
-    X_train, y_train = reshape_data(dataset.traindata(T)...)
-    X_test, y_test = reshape_data(dataset.testdata(T)...)
+    X_train, y_train = reshape_data(dataset(T, :train)[:]...)
+    X_test, y_test = reshape_data(dataset(T, :test)[:]...)
     y_train = T.(y_train)
     y_test = T.(y_test)

docs/src/lecture_11/exercises.md
Lines changed: 4 additions & 4 deletions

@@ -4,11 +4,11 @@ using Flux
 using MLDatasets
 using DataFrames
 using Plots
-using Flux: onehotbatch, onecold, flatten
+using Flux: onehotbatch, onecold, flatten, params
 
 Core.eval(Main, :(using Flux)) # hide
 ENV["DATADEPS_ALWAYS_ACCEPT"] = true
-MNIST.traindata()
+MNIST(Float32, :train)
 
 function reshape_data(X::AbstractArray{<:Real, 3})
     s = size(X)
@@ -28,8 +28,8 @@ function train_or_load!(file_name, m, args...; force=false, kwargs...)
 end
 
 function load_data(dataset; T=Float32, onehot=false, classes=0:9)
-    X_train, y_train = dataset.traindata(T)
-    X_test, y_test = dataset.testdata(T)
+    X_train, y_train = dataset(T, :train)[:]
+    X_test, y_test = dataset(T, :test)[:]
 
     X_train = reshape_data(X_train)
     X_test = reshape_data(X_test)
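
The helpers now imported explicitly from Flux behave as in this small sketch (illustrative, not part of the commit):

```julia
using Flux: onehotbatch, onecold

y = [0, 3, 9]            # integer class labels
Y = onehotbatch(y, 0:9)  # 10×3 one-hot matrix, one column per sample
onecold(Y, 0:9)          # decodes the columns back to [0, 3, 9]
```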

docs/src/lecture_11/iris.md
Lines changed: 2 additions & 0 deletions

@@ -62,6 +62,8 @@ Because there are ``3`` classes and ``120`` samples in the training set, it returns
 We access the neural network parameters by using `params(m)`. We can select the second layer of `m` by `m[2]`. Since the second layer has ``5`` input and ``3`` output neurons, its parameters are a matrix of size ``3\times 5`` and a vector of length ``3``. The parameters `params(m[2])` are a tuple of the matrix and the vector. The parameters are initialized randomly, so we do not need to set them ourselves. We can easily modify any parameters.
 
 ```@example iris
+using Flux: params
+
 params(m[2])[2] .= [-1;0;1]
 
 nothing # hide
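
To make the quoted passage concrete, here is a sketch of inspecting and overwriting the second layer's bias; the 4-5-3 architecture is assumed from the iris lecture and is not part of the commit:

```julia
using Flux
using Flux: params

# Assumed architecture: 4 iris features -> 5 hidden neurons -> 3 classes.
m = Chain(Dense(4 => 5, relu), Dense(5 => 3))

ps = params(m[2])  # trainable arrays of the second layer only
size(ps[1])        # (3, 5): the weight matrix
size(ps[2])        # (3,): the bias vector

ps[2] .= [-1; 0; 1]  # overwrite the bias in place, as in the lecture
```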

docs/src/lecture_11/nn.md
Lines changed: 7 additions & 6 deletions

@@ -5,7 +5,7 @@ using MLDatasets
 
 Core.eval(Main, :(using Flux)) # hide
 ENV["DATADEPS_ALWAYS_ACCEPT"] = true
-X_train = MNIST.traindata()[1]
+X_train = MNIST(Float32, :train)[:][1]
 
 # imageplot(1 .- X_train, 1:15; nrows = 3, size=(800,480))
 
@@ -37,8 +37,8 @@ We use the package [MLDatasets](https://juliaml.github.io/MLDatasets.jl/stable/)
 using MLDatasets
 
 T = Float32
-X_train, y_train = MLDatasets.MNIST.traindata(T)
-X_test, y_test = MLDatasets.MNIST.testdata(T)
+X_train, y_train = MLDatasets.MNIST(T, :train)[:]
+X_test, y_test = MLDatasets.MNIST(T, :test)[:]
 
 nothing # hide
 ```
@@ -89,7 +89,6 @@ using Plots
 using ImageInspector
 
 imageplot(1 .- X_train, inds; nrows=3, size=(800,480))
-
 savefig("mnist_intro2.svg") # hide
 ```
 
@@ -161,8 +160,8 @@ using Flux
 using Flux: onehotbatch, onecold
 
 function load_data(dataset; T=Float32, onehot=false, classes=0:9)
-    X_train, y_train = dataset.traindata(T)
-    X_test, y_test = dataset.testdata(T)
+    X_train, y_train = dataset(T, :train)[:]
+    X_test, y_test = dataset(T, :test)[:]
 
     X_train = reshape_data(X_train)
     X_test = reshape_data(X_test)
@@ -363,6 +362,7 @@ To build the objective ``L``, we first specify the prediction function
 
 ```@example nn
 using Random
+using Flux: flatten
 
 Random.seed!(666)
 m = Chain(
@@ -392,6 +392,7 @@ We now write the function `train_model!` to train the neural network `m`. Since
 
 ```@example nn
 using BSON
+using Flux: params
 
 function train_model!(m, L, X, y;
     opt = Descent(0.1),
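
Two of the hunks above only add imports; a hedged sketch of why each is needed follows. First, `flatten` bridges convolutional and dense layers by collapsing a `(width, height, channels, batch)` array into `(features, batch)`; the layer sizes here are illustrative, not the lecture's:

```julia
using Flux
using Flux: flatten

m = Chain(
    Conv((3, 3), 1 => 8, relu),  # 28×28×1 input -> 26×26×8
    MaxPool((2, 2)),             # -> 13×13×8
    flatten,                     # -> 1352 features (13*13*8)
    Dense(1352 => 10),
)

x = rand(Float32, 28, 28, 1, 16)  # a batch of 16 MNIST-sized images
size(m(x))                        # (10, 16)
```

Second, the `params` import suggests `train_model!` uses Flux's implicit-parameters training style; this is an assumption about code outside the diff, so the model and loss below are placeholders:

```julia
using Flux
using Flux: params

m = Chain(Dense(2 => 3, relu), Dense(3 => 2))
L(X, y) = Flux.Losses.logitcrossentropy(m(X), y)

X = rand(Float32, 2, 10)
y = Flux.onehotbatch(rand(0:1, 10), 0:1)

ps = params(m)                    # collect all trainable arrays of m
gs = gradient(() -> L(X, y), ps)  # implicit-style gradient
Flux.Optimise.update!(Descent(0.1), ps, gs)
```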
