docs/src/performance.md (+8 −8)
@@ -13,16 +13,15 @@ not because the operations are faster, but because the memory usage is halved.
Which means allocations occur much faster.
And you use less memory.

-
## Preserve inputs' types

Not only should your activation and loss functions be [type-stable](https://docs.julialang.org/en/v1/manual/performance-tips/#Write-%22type-stable%22-functions-1),
they should also preserve the type of their inputs.

A very artificial example using an activation function like

-```
-my_tanh(x) = Float64(tanh(x))
+```julia
+my_tanh(x) = Float64(tanh(x))
```

will result in performance on `Float32` input orders of magnitude slower than the normal `tanh` would,
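A minimal sketch (not part of the documented example) of how the promotion shows up: broadcast each definition over a `Float32` vector and compare the output element types.

```julia
# Sketch: compare output element types for a Float32 input.
my_tanh(x) = Float64(tanh(x))   # casts every output to Float64

x = rand(Float32, 100)

typeof(tanh.(x))     # Vector{Float32} -- element type preserved
typeof(my_tanh.(x))  # Vector{Float64} -- silently promoted
```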
@@ -35,20 +34,21 @@ you will see a large slow-down.
This can occur sneakily, because you can cause type-promotion by interacting with numeric literals.
E.g. the following will run into the same problem as above:

-```
-leaky_tanh(x) = 0.01*x + tanh(x)
+```julia
+leaky_tanh(x) = 0.01*x + tanh(x)
```
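To see why the literal matters (a minimal sketch, not from the documented example): multiplying a `Float32` value by the `Float64` literal `0.01` promotes the result to `Float64`.

```julia
# Sketch: a Float64 literal promotes Float32 arithmetic.
x = 1f0               # Float32
typeof(0.01 * x)      # Float64 -- the literal 0.01 is a Float64
typeof(0.01f0 * x)    # Float32 -- a Float32 literal preserves the type
```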

While one could change the activation function (e.g. to use `0.01f0*x`), the idiomatic (and safe) way to avoid type casts whenever the input changes is to use `oftype`:
-```
-leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)
-```

+```julia
+leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)
+```
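As a quick check of the `oftype` version (a sketch, reusing the definition above): `oftype(x/1, 0.01)` converts the literal to the floating-point type of `x`, so `Float32` inputs stay `Float32`. The `x/1` ensures an integer input is first promoted to a floating-point type, so the literal is never truncated to an integer.

```julia
# Sketch: oftype converts the literal to the input's floating-point type.
leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)

typeof(leaky_tanh(1f0))  # Float32 -- no promotion for Float32 input
typeof(leaky_tanh(1.0))  # Float64
```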

## Evaluate batches as Matrices of features

While it can sometimes be tempting to process your observations (feature vectors) one at a time