Commit 65c37c1

remove typewarn
1 parent: 6d03787

2 files changed: +8, -12 lines

docs/src/performance.md (8 additions, 8 deletions)

````diff
@@ -13,16 +13,15 @@ not because the operations are faster, but because the memory usage is halved.
 Which means allocations occur much faster.
 And you use less memory.
 
-
 ## Preserve inputs' types
 
 Not only should your activation and loss functions be [type-stable](https://docs.julialang.org/en/v1/manual/performance-tips/#Write-%22type-stable%22-functions-1),
 they should also preserve the type of their inputs.
 
 A very artificial example using an activation function like
 
-```
-my_tanh(x) = Float64(tanh(x))
+```julia
+my_tanh(x) = Float64(tanh(x))
 ```
 
 will result in performance on `Float32` input orders of magnitude slower than the normal `tanh` would,
@@ -35,20 +34,21 @@ you will see a large slow-down.
 This can occur sneakily, because you can cause type-promotion by interacting with numeric literals.
 E.g. the following will run into the same problem as above:
 
-```
-leaky_tanh(x) = 0.01*x + tanh(x)
+```julia
+leaky_tanh(x) = 0.01*x + tanh(x)
 ```
 
 While one could change the activation function (e.g. to use `0.01f0*x`), the idiomatic (and safe) way to avoid type casts whenever the input changes is to use `oftype`:
-```
-leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)
-```
 
+```julia
+leaky_tanh(x) = oftype(x/1, 0.01)*x + tanh(x)
+```
 
 ## Evaluate batches as Matrices of features
 
 While it can sometimes be tempting to process your observations (feature vectors) one at a time
 e.g.
+
 ```julia
 function loss_total(xs::AbstractVector{<:Vector}, ys::AbstractVector{<:Vector})
     sum(zip(xs, ys)) do (x, y_target)
````

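The docs hunk above only adds syntax highlighting to the fences, but the point it documents is easy to verify in a REPL. As a minimal sketch (not part of this commit; `promoting_tanh` is a hypothetical name for the problematic version), the bare `Float64` literal promotes `Float32` input, while the `oftype` version preserves it:

```julia
# Sketch of the docs' point: a bare Float64 literal promotes the result,
# while oftype converts the literal to the input's floating-point type.
promoting_tanh(x) = 0.01 * x + tanh(x)              # 0.01 is a Float64 literal
leaky_tanh(x) = oftype(x / 1, 0.01) * x + tanh(x)   # literal matched to x's type

x = 0.5f0                   # a Float32 input
typeof(promoting_tanh(x))   # Float64 -- silently promoted
typeof(leaky_tanh(x))       # Float32 -- input type preserved
```

The `x / 1` inside the `oftype` call is what keeps this safe for integer inputs too: dividing by `1` always yields a floating-point value, so `0.01` is converted to a float type rather than failing to convert to an integer.
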
src/layers/basic.jl (0 additions, 4 deletions)

````diff
@@ -120,7 +120,6 @@ end
 @functor Dense
 
 function (a::Dense)(x::AbstractVecOrMat)
-  eltype(a.W) == eltype(x) || _dense_typewarn(a, x)
   W, b, σ = a.W, a.b, a.σ
   # reshape to handle dims > 1 as batch dimensions
   sz = size(x)
@@ -129,9 +128,6 @@ function (a::Dense)(x::AbstractVecOrMat)
   return reshape(x, :, sz[2:end]...)
 end
 
-_dense_typewarn(d, x) = @warn "Element types don't match for layer $d, this will be slow." typeof(d.W) typeof(x) maxlog=1
-Zygote.@nograd _dense_typewarn
-
 function Base.show(io::IO, l::Dense)
   print(io, "Dense(", size(l.W, 2), ", ", size(l.W, 1))
   l.σ == identity || print(io, ", ", l.σ)
````

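For context on the source change: the deleted `_dense_typewarn` helper logged a one-time warning (`maxlog=1`) when the eltype of a `Dense` layer's weights differed from the eltype of its input, and `Zygote.@nograd` excluded it from gradient computation. After this commit the mismatch is silent, though the promotion (and the slowdown the performance docs describe) still occurs. A rough sketch of the user-visible difference, assuming a Flux version of this era, where `Dense(in, out)` creates `Float32` weights by default:

```julia
using Flux

d = Dense(3, 2)        # d.W is a Float32 matrix by default
x = rand(Float64, 3)   # Float64 input, so the eltypes do not match

# Before this commit, the first such call logged:
#   "Element types don't match for layer ..., this will be slow."
# After it, the call is silent.
y = d(x)
eltype(y)              # Float64 -- the promotion still happens
```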