Commit 51f8a38

Minor bug in the guide

1 parent 8f89bd7

1 file changed: +7 additions, −16 deletions

docs/src/getting_started/linear_regression.md

Lines changed: 7 additions & 16 deletions
````diff
@@ -247,19 +247,13 @@ The line fits well! There is room for improvement, but we leave that up to you!
 We now move on to a relatively complex linear regression model. Here we will use a real dataset from [`MLDatasets.jl`](https://github.com/JuliaML/MLDatasets.jl), which will not confine our data points to have only one feature. Let's start by importing the required packages -
 
 ```jldoctest linear_regression_complex
-julia> using Flux
-
-julia> using Statistics
-
-julia> using MLDatasets: BostonHousing
+julia> using Flux, Statistics, MLDatasets, DataFrames
 ```
 
 ### Data
 Let's start by initializing our dataset. We will be using the [`BostonHousing`](https://juliaml.github.io/MLDatasets.jl/stable/datasets/misc/#MLDatasets.BostonHousing) dataset consisting of `506` data points. Each of these data points has `13` features and a corresponding label, the house's price. The `x`s are still mapped to a single `y`, but now, a single `x` data point has 13 features.
 
 ```jldoctest linear_regression_complex
-julia> using DataFrames
-
 julia> dataset = BostonHousing()
 dataset BostonHousing:
   metadata => Dict{String, Any} with 5 entries
````

````diff
@@ -324,7 +318,7 @@ The training procedure would make use of the same mathematics, but now we can pa
 
 ```jldoctest linear_regression_complex
 julia> function train_model()
-           dLdm, _, _ = gradient(loss, model, x, y)
+           dLdm, _, _ = gradient(loss, model, x_train_n, y_train)
            @. model.weight = model.weight - 0.000001 * dLdm.weight
            @. model.bias = model.bias - 0.000001 * dLdm.bias
        end;
````

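The bug this commit fixes is that `gradient` was differentiating the loss at the raw data `x, y`, while the loss being monitored (and the data the model was meant to fit) was the standardized training split `x_train_n, y_train`. A minimal base-Julia sketch of the corrected pattern, with a hand-written mean-squared-error gradient standing in for Flux's `gradient` (all names and values here are illustrative, not from the guide):

```julia
using Statistics

# toy data, features-by-samples as in the guide:
# 3 features on very different scales, 10 samples
x = rand(3, 10) .* [1.0, 100.0, 10_000.0]
y = sum(x ./ [1.0, 100.0, 10_000.0]; dims=1)

# standardize each feature, mirroring the guide's `x_train_n`
x_n = (x .- mean(x; dims=2)) ./ std(x; dims=2)

W, b = zeros(1, 3), zeros(1, 1)

mse(W, b, x, y) = mean(abs2, (W * x .+ b) .- y)

# hand-written gradient of `mse` with respect to W and b
function mse_grad(W, b, x, y)
    err = (W * x .+ b) .- y
    return 2 .* err * x' ./ size(x, 2), fill(2 * mean(err), 1, 1)
end

# one descent step: the gradient is evaluated on the SAME standardized
# data the loss is monitored on -- the point of the fix
before = mse(W, b, x_n, y)
dW, db = mse_grad(W, b, x_n, y)
W .-= 0.1 .* dW
b .-= 0.1 .* db
after = mse(W, b, x_n, y)
```

Evaluating the gradient at one dataset while tracking the loss on another would make the stopping criterion below meaningless, which is why the mismatch counted as a bug.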
````diff
@@ -342,7 +336,7 @@ julia> while true
               loss_init = loss(model, x_train_n, y_train)
               continue
           end
-          if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-3
+          if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-4
               break
           else
               loss_init = loss(model, x_train_n, y_train)
````

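The tolerance change above (`1e-3` → `1e-4`) tightens the loop's stopping rule: iterate until two successive loss evaluations differ by less than the tolerance. A generic base-Julia sketch of that rule (the `train_until_converged!` name and the `step!`/`loss` closures are hypothetical, not from the guide):

```julia
# run `step!` until `loss` changes by less than `tol` between iterations
function train_until_converged!(step!, loss; tol = 1e-4, maxiter = 500_000)
    loss_init = loss()
    for i in 1:maxiter
        step!()
        # same criterion as the guide's loop: a near-flat loss means stop
        abs(loss_init - loss()) < tol && return i
        loss_init = loss()
    end
    return maxiter
end

# toy usage: gradient descent on (w - 4)^2 drives w toward 4
w = Ref(0.0)
step!() = (w[] -= 0.1 * 2 * (w[] - 4))
iters = train_until_converged!(step!, () -> (w[] - 4)^2)
```

A smaller tolerance trades more iterations for a flatter (better-converged) final loss, which is presumably why the guide tightened it.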
````diff
@@ -386,8 +380,7 @@ After getting familiar with the basics of `Flux` and `Julia`, we moved ahead to
 ## Copy-pastable code
 ### Dummy dataset
 ```julia
-using Flux
-using Plots
+using Flux, Plots
 
 # data
 x = hcat(collect(Float32, -3:0.1:3)...)
````

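As an aside on the unchanged context line above: `hcat(collect(Float32, -3:0.1:3)...)` splats the 61-element range into `hcat`, producing a 1×61 row matrix, since Flux expects data laid out as features × samples. A quick base-Julia check (the `reshape` alternative is only an illustration, not part of the guide):

```julia
# splatting the collected vector into hcat makes each scalar a column:
# one feature, 61 samples
x = hcat(collect(Float32, -3:0.1:3)...)

# the same matrix without a 61-argument splat
x2 = reshape(collect(Float32, -3:0.1:3), 1, :)
```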
````diff
@@ -430,9 +423,7 @@ plot!((x) -> b[1] + W[1] * x, -3, 3, label="Custom model", lw=2)
 ```
 ### Real dataset
 ```julia
-using Flux
-using Statistics
-using MLDatasets: BostonHousing
+using Flux, Statistics, MLDatasets
 
 # data
 x, y = BostonHousing(as_df=false)[:]
````

````diff
@@ -452,7 +443,7 @@ print("Initial loss: ", loss(model, x_train_n, y_train), "\n")
 
 # train
 function train_custom_model()
-    dLdm, _, _ = gradient(loss, model, x, y)
+    dLdm, _, _ = gradient(loss, model, x_train_n, y_train)
     @. model.weight = model.weight - 0.000001 * dLdm.weight
     @. model.bias = model.bias - 0.000001 * dLdm.bias
 end
````

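Also worth noting in the unchanged lines above: `@.` rewrites every operation in the expression as a broadcast, so the assignment becomes an in-place elementwise `.=` update of the weight and bias arrays rather than an allocation of new ones. A tiny sketch with made-up values:

```julia
W  = [1.0 2.0 3.0]       # stand-in for model.weight
dW = [10.0 20.0 30.0]    # stand-in for dLdm.weight

# `@. W = W - 0.000001 * dW` is equivalent to
# `W .= W .- 0.000001 .* dW`: elementwise, updating W in place
@. W = W - 0.000001 * dW
```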
````diff
@@ -464,7 +455,7 @@ while true
        loss_init = loss(model, x_train_n, y_train)
        continue
    end
-    if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-3
+    if abs(loss_init - loss(model, x_train_n, y_train)) < 1e-4
        break
    else
        loss_init = loss(model, x_train_n, y_train)
````

