Commit 3ec6540

remove a few comments

1 parent: ba975db

File tree: 1 file changed (+6, −8 lines)

docs/src/models/quickstart.md

Lines changed: 6 additions & 8 deletions
````diff
@@ -6,18 +6,18 @@ If you haven't, then you might prefer the [Fitting a Straight Line](overview.md)
 
 ```julia
 # With Julia 1.7+, this will prompt if necessary to install everything, including CUDA:
-using Flux, Statistics
+using Flux, Zygote, Statistics
 
 # Generate some data for the XOR problem: vectors of length 2, as columns of a matrix:
 noisy = rand(Float32, 2, 1000)  # 2×1000 Matrix{Float32}
 truth = [xor(col[1]>0.5, col[2]>0.5) for col in eachcol(noisy)]  # 1000-element Vector{Bool}
 
 # Define our model, a multi-layer perceptron with one hidden layer of size 3:
 model = Chain(
-    Dense(2 => 3, tanh),  # activation function inside...
+    Dense(2 => 3, tanh),  # activation function inside layer
     BatchNorm(3),
     Dense(3 => 2),
-    softmax)  # ... but softmax outside a layer.
+    softmax)
 
 # The model encapsulates parameters, randomly initialised. Its initial output is:
 out1 = model(noisy)  # 2×1000 Matrix{Float32}
````
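As a quick sanity check of the model definition in this first hunk, where `softmax` sits outside the layers, here is a minimal sketch. The model and data lines are copied from the diff; the final `@assert` is an illustrative addition, not part of the commit:

```julia
using Flux, Statistics

model = Chain(
    Dense(2 => 3, tanh),
    BatchNorm(3),
    Dense(3 => 2),
    softmax)

noisy = rand(Float32, 2, 1000)
out1 = model(noisy)  # 2×1000 Matrix{Float32}

# softmax normalises each column, so the two class probabilities sum to 1:
@assert all(≈(1f0), sum(out1, dims=1))
```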
````diff
@@ -34,15 +34,13 @@ opt = Flux.Adam(0.01)  # will store optimiser momentum, etc.
 for epoch in 1:1_000
     losses = []
     for (x, y) in loader
-        loss, grad = Flux.withgradient(pars) do
+        loss, grad = Zygote.withgradient(pars) do
             # Evaluate model and loss inside gradient context:
             y_hat = model(x)
-            Flux.crossentropy(y_hat, y)  # could use just sum(abs2, y_hat .- y)
+            Flux.crossentropy(y_hat, y)
         end
-        # Use the gradient to update the model's parameters (and momentum):
         Flux.update!(opt, pars, grad)
-        # Logging code, outside gradient context:
-        push!(losses, loss)
+        push!(losses, loss)  # logging, outside gradient context
     end
     if isinteger(log2(epoch))
         println("after epoch $epoch, loss is ", mean(losses))
````
