
Commit 2f32201

Refactorization
1 parent dd74cf0 · commit 2f32201

File tree

9 files changed: +146 -140 lines


docs/make.jl

Lines changed: 5 additions & 1 deletion
@@ -43,7 +43,8 @@ lecture_01 = [
 
 lecture_02 = [
     "Arrays" => "./lecture_02/arrays.md",
-    "Data structures" => "./lecture_02/data_structures.md",
+    "Tuples and named tuples" => "./lecture_02/tuples.md",
+    "Dictionaries" => "./lecture_02/dictionaries.md",
 ]
 
 lecture_03 = [
@@ -90,19 +91,22 @@ lecture_08 = joinpath.("./lecture_08/", [
     "unconstrained.md",
     "constrained.md",
     "exercises.md",
+    "homework.md",
 ])
 
 lecture_09 = joinpath.("./lecture_09/", [
     "theory.md",
     "linear.md",
     "logistic.md",
     "exercises.md",
+    "homework.md",
 ])
 
 lecture_10 = joinpath.("./lecture_10/", [
     "theory.md",
     "nn.md",
     "exercises.md",
+    "homework.md",
 ])
 
 lecture_11 = joinpath.("./lecture_11/", [
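For context: these page lists are presumably consumed by Documenter's `makedocs` further down in make.jl (not shown in this diff). A hypothetical sketch of that wiring; the site name and lecture titles are assumptions, not taken from the commit.

```julia
# Hypothetical sketch, not part of this diff: how page lists like the ones
# above are typically passed to Documenter. The site name is assumed.
using Documenter

makedocs(;
    sitename = "Course notes",       # assumed, not shown in the diff
    pages = [
        "Lecture 2" => lecture_02,   # "Section title" => list of page paths
        "Lecture 8" => lecture_08,
        "Lecture 9" => lecture_09,
        "Lecture 10" => lecture_10,
    ],
)
```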

docs/src/lecture_02/dictionaries.md

Lines changed: 98 additions & 0 deletions
@@ -0,0 +1,98 @@
+# Dictionaries
+
+Dictionaries are mutable, unordered collections of key-value pairs (the iteration order is arbitrary). The syntax for creating a dictionary is:
+
+```jldoctest dicts
+julia> d = Dict("a" => [1, 2, 3], "b" => 1)
+Dict{String, Any} with 2 entries:
+  "b" => 1
+  "a" => [1, 2, 3]
+```
+
+Another possibility is to use symbols instead of strings as keys.
+
+```jldoctest dicts
+julia> d = Dict(:a => [1, 2, 3], :b => 1)
+Dict{Symbol, Any} with 2 entries:
+  :a => [1, 2, 3]
+  :b => 1
+```
+
+It is possible to use almost any type as a key in a dictionary. The elements of a dictionary can be accessed via square brackets and a key.
+
+```jldoctest dicts
+julia> d[:a]
+3-element Vector{Int64}:
+ 1
+ 2
+ 3
+```
+
+If the key does not exist in the dictionary, an error occurs when we try to access it.
+
+```jldoctest dicts
+julia> d[:c]
+ERROR: KeyError: key :c not found
+
+julia> haskey(d, :c)
+false
+```
+
+The `haskey` function checks whether the dictionary has the `:c` key. To avoid such errors, we can use the `get` function, which accepts three arguments: a dictionary, a key, and a default value that is returned if the key does not exist in the dictionary.
+
+```jldoctest dicts
+julia> get(d, :c, 42)
+42
+```
+
+There is also an in-place version of the `get` function: the `get!` function additionally adds the default value to the dictionary if the key does not exist.
+
+```jldoctest dicts
+julia> get!(d, :c, 42)
+42
+
+julia> get!(d, :d, ["hello", "world"])
+2-element Vector{String}:
+ "hello"
+ "world"
+
+julia> d
+Dict{Symbol, Any} with 4 entries:
+  :a => [1, 2, 3]
+  :b => 1
+  :d => ["hello", "world"]
+  :c => 42
+```
+
+Unwanted keys can be removed from a dictionary with the `delete!` function.
+
+```jldoctest dicts
+julia> delete!(d, :d)
+Dict{Symbol, Any} with 3 entries:
+  :a => [1, 2, 3]
+  :b => 1
+  :c => 42
+
+julia> haskey(d, :d)
+false
+```
+
+An alternative is the `pop!` function, which removes the key from the dictionary and returns the corresponding value.
+
+```jldoctest dicts
+julia> pop!(d, :c)
+42
+
+julia> haskey(d, :c)
+false
+```
+
+Optionally, the `pop!` function accepts a default value for the given key, which is returned if the key does not exist in the dictionary.
+
+```jldoctest dicts
+julia> haskey(d, :c)
+false
+
+julia> pop!(d, :c, 444)
+444
+```
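Since the new page stresses that dictionaries are unordered, a short sketch (not part of the commit) of iterating over one; the dictionary is the one from the examples above.

```julia
# Sketch, not part of the commit: iterating over a dictionary yields
# key-value pairs in an arbitrary order.
d = Dict(:a => [1, 2, 3], :b => 1)

for (key, value) in d
    println(key, " => ", value)
end

collect(keys(d))    # all keys, in the same arbitrary order as iteration
collect(values(d))  # all values
```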

docs/src/lecture_02/data_structures.md renamed to docs/src/lecture_02/tuples.md

Lines changed: 1 addition & 100 deletions
@@ -130,103 +130,4 @@ julia> a, b, c = t
 
 julia> println("The values stored in the tuple are: a = $a, b = $b")
 The values stored in the tuple are: a = 1, b = 2.0
-```
-
-## Dictionaries
-
-Dictionaries are mutable, unordered collections of key-value pairs (the iteration order is arbitrary). The syntax for creating a dictionary is:
-
-```jldoctest dicts
-julia> d = Dict("a" => [1, 2, 3], "b" => 1)
-Dict{String, Any} with 2 entries:
-  "b" => 1
-  "a" => [1, 2, 3]
-```
-
-Another possibility is to use symbols instead of strings as keys.
-
-```jldoctest dicts
-julia> d = Dict(:a => [1, 2, 3], :b => 1)
-Dict{Symbol, Any} with 2 entries:
-  :a => [1, 2, 3]
-  :b => 1
-```
-
-It is possible to use almost any type as a key in a dictionary. The elements of a dictionary can be accessed via square brackets and a key.
-
-```jldoctest dicts
-julia> d[:a]
-3-element Vector{Int64}:
- 1
- 2
- 3
-```
-
-If the key does not exist in the dictionary, an error occurs when we try to access it.
-
-```jldoctest dicts
-julia> d[:c]
-ERROR: KeyError: key :c not found
-
-julia> haskey(d, :c)
-false
-```
-
-The `haskey` function checks whether the dictionary has the `:c` key. To avoid such errors, we can use the `get` function, which accepts three arguments: a dictionary, a key, and a default value that is returned if the key does not exist in the dictionary.
-
-```jldoctest dicts
-julia> get(d, :c, 42)
-42
-```
-
-There is also an in-place version of the `get` function: the `get!` function additionally adds the default value to the dictionary if the key does not exist.
-
-```jldoctest dicts
-julia> get!(d, :c, 42)
-42
-
-julia> get!(d, :d, ["hello", "world"])
-2-element Vector{String}:
- "hello"
- "world"
-
-julia> d
-Dict{Symbol, Any} with 4 entries:
-  :a => [1, 2, 3]
-  :b => 1
-  :d => ["hello", "world"]
-  :c => 42
-```
-
-Unwanted keys can be removed from a dictionary with the `delete!` function.
-
-```jldoctest dicts
-julia> delete!(d, :d)
-Dict{Symbol, Any} with 3 entries:
-  :a => [1, 2, 3]
-  :b => 1
-  :c => 42
-
-julia> haskey(d, :d)
-false
-```
-
-An alternative is the `pop!` function, which removes the key from the dictionary and returns the corresponding value.
-
-```jldoctest dicts
-julia> pop!(d, :c)
-42
-
-julia> haskey(d, :c)
-false
-```
-
-Optionally, the `pop!` function accepts a default value for the given key, which is returned if the key does not exist in the dictionary.
-
-```jldoctest dicts
-julia> haskey(d, :c)
-false
-
-julia> pop!(d, :c, 444)
-444
-```
+```

docs/src/lecture_08/exercises.md

Lines changed: 0 additions & 22 deletions
@@ -1,27 +1,5 @@
 # [Exercises](@id l7-exercises)
 
-```@raw html
-<div class="admonition is-category-homework">
-<header class="admonition-header">Homework: Newton's method</header>
-<div class="admonition-body">
-```
-
-Newton's method for solving the equation ``g(x)=0`` is an iterative procedure which, at each iteration, approximates the function ``g(x)`` around the current iterate ``x^k`` by its first-order (linear) expansion ``g(x) \approx g(x^k) + \nabla g(x^k)(x-x^k)`` and finds the zero of this approximation.
-
-Newton's method for unconstrained optimization replaces the optimization problem by its optimality condition ``\nabla f(x)=0`` and solves the resulting equation.
-
-Implement Newton's method to minimize
-
-```math
-f(x) = e^{x_1^2 + x_2^2 - 1} + (x_1-1)^2
-```
-
-with the starting point ``x^0=(0,0)``.
-
-```@raw html
-</div></div>
-```
-
 ```@raw html
 <div class="admonition is-category-exercise">
 <header class="admonition-header">Exercise 1: Solving a system of linear equations</header>

docs/src/lecture_08/homework.md

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
+# Homework
+
+```@raw html
+<div class="admonition is-category-homework">
+<header class="admonition-header">Homework: Newton's method</header>
+<div class="admonition-body">
+```
+
+Newton's method for solving the equation ``g(x)=0`` is an iterative procedure which, at each iteration, approximates the function ``g(x)`` around the current iterate ``x^k`` by its first-order (linear) expansion ``g(x) \approx g(x^k) + \nabla g(x^k)(x-x^k)`` and finds the zero of this approximation.
+
+Newton's method for unconstrained optimization replaces the optimization problem by its optimality condition ``\nabla f(x)=0`` and solves the resulting equation.
+
+Implement Newton's method to minimize
+
+```math
+f(x) = e^{x_1^2 + x_2^2 - 1} + (x_1-1)^2
+```
+
+with the starting point ``x^0=(0,0)``.
+
+```@raw html
+</div></div>
+```
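One possible sketch of the assigned method (not part of the commit), assuming the ForwardDiff package for the derivatives; the objective and the starting point are taken from the assignment above.

```julia
# Sketch of one possible solution; assumes ForwardDiff is available.
using ForwardDiff, LinearAlgebra

f(x) = exp(x[1]^2 + x[2]^2 - 1) + (x[1] - 1)^2

function newton(f, x; maxiter = 100, tol = 1e-8)
    for _ in 1:maxiter
        grad = ForwardDiff.gradient(f, x)
        norm(grad) < tol && break              # stop at a stationary point
        x -= ForwardDiff.hessian(f, x) \ grad  # Newton step: solve H*d = grad
    end
    return x
end

newton(f, [0.0, 0.0])  # starting point x⁰ = (0, 0)
```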

docs/src/lecture_09/exercises.md

Lines changed: 0 additions & 12 deletions
@@ -40,18 +40,6 @@ w = log_reg(X, y, zeros(size(X,2)))
 
 # [Exercises](@id l8-exercises)
 
-!!! homework "Homework: Data normalization"
-    Data are often normalized: from each feature, we subtract its mean and divide the result by its standard deviation. The normalized features have zero mean and unit standard deviation. This may help in several cases:
-    - When each feature has a different order of magnitude (such as millimetres and kilometres). Then the gradient would ignore the feature with the smaller values.
-    - When problems such as vanishing gradients are present (we will elaborate on this in Exercise 4).
-
-    Write a function `normalize` which takes a dataset as input and normalizes it. Then train the same classifier as we did for [logistic regression](@ref log-reg), using both the original and the normalized dataset. Which differences do you observe when
-    - the logistic regression is optimized via gradient descent?
-    - the logistic regression is optimized via Newton's method?
-    Do you have any intuition as to why?
-
-    Write a short report (in LaTeX) summarizing your findings.
-
 ```@raw html
 <div class="admonition is-category-exercise">
 <header class="admonition-header">Exercise 1:</header>

docs/src/lecture_09/homework.md

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+# Homework
+
+!!! homework "Homework: Data normalization"
+    Data are often normalized: from each feature, we subtract its mean and divide the result by its standard deviation. The normalized features have zero mean and unit standard deviation. This may help in several cases:
+    - When each feature has a different order of magnitude (such as millimetres and kilometres). Then the gradient would ignore the feature with the smaller values.
+    - When problems such as vanishing gradients are present (we will elaborate on this in Exercise 4).
+
+    Write a function `normalize` which takes a dataset as input and normalizes it. Then train the same classifier as we did for [logistic regression](@ref log-reg), using both the original and the normalized dataset. Which differences do you observe when
+    - the logistic regression is optimized via gradient descent?
+    - the logistic regression is optimized via Newton's method?
+    Do you have any intuition as to why?
+
+    Write a short report (in LaTeX) summarizing your findings.
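One possible sketch of the requested `normalize` function (not part of the commit), under the assumption that the dataset is a matrix whose columns are features.

```julia
# Sketch, not part of the commit: column-wise normalization of a matrix,
# assuming each column of X is one feature.
using Statistics

function normalize(X::AbstractMatrix)
    col_mean = mean(X; dims = 1)  # 1×n row of feature means
    col_std = std(X; dims = 1)    # 1×n row of feature standard deviations
    return (X .- col_mean) ./ col_std
end
```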

docs/src/lecture_10/exercises.md

Lines changed: 0 additions & 5 deletions
@@ -108,11 +108,6 @@ y = iris.Species
 
 # [Exercises](@id l9-exercises)
 
-!!! homework "Homework: Optimal setting"
-    Perform an analysis of the hyperparameters of the neural network from this lecture. Examples may include the network architecture, the learning rate (stepsize), activation functions, or normalization.
-
-    Write a short summary (in LaTeX) of your suggestions.
-
 ```@raw html
 <div class="admonition is-category-exercise">
 <header class="admonition-header">Exercise 1: Keyword arguments</header>

docs/src/lecture_10/homework.md

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
+# Homework
+
+!!! homework "Homework: Optimal setting"
+    Perform an analysis of the hyperparameters of the neural network from this lecture. Examples may include the network architecture, the learning rate (stepsize), activation functions, or normalization.
+
+    Write a short summary (in LaTeX) of your suggestions.
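A minimal, framework-agnostic sketch of such an analysis (not part of the commit); `train_network` and `accuracy` are hypothetical stand-ins for the lecture's training loop and evaluation metric.

```julia
# Sketch, not part of the commit. `train_network` and `accuracy` are
# hypothetical placeholders for the lecture's training loop and metric.
results = Dict{Tuple{Float64, Int}, Float64}()

# Grid search over two hyperparameters: learning rate and hidden-layer size.
for learning_rate in (1e-3, 1e-2, 1e-1), hidden_size in (8, 16, 32)
    model = train_network(; learning_rate, hidden_size)  # hypothetical
    results[(learning_rate, hidden_size)] = accuracy(model)
end

# Sort the settings from best to worst for the written summary.
sort(collect(results); by = last, rev = true)
```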
