@@ -106,9 +106,9 @@ of label smoothing to binary distributions encoded in a single number.
 # Example
 ```jldoctest
 julia> y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)
-2×6 Flux.OneHotArray{2,2, Vector{UInt32}}:
- 0  0  0  1  0  1
- 1  1  1  0  1  0
+2×6 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
+ ⋅  ⋅  ⋅  1  ⋅  1
+ 1  1  1  ⋅  1  ⋅
 
 julia> y_smoothed = Flux.label_smoothing(y, 0.2f0)
 2×6 Matrix{Float32}:
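To sanity-check the updated output in this hunk outside the doctest, here is a minimal sketch, assuming Flux ≥ 0.13 (where `onehotbatch` returns the `OneHotMatrix` shown in the new printing) and assuming `label_smoothing` follows its documented form `y .* (1 - α) .+ α / K`; with α = 0.2 and two classes this maps 1 ↦ 0.9f0 and 0 ↦ 0.1f0.

```julia
using Flux  # assumes Flux ≥ 0.13 / OneHotArrays, matching the new show output

y = Flux.onehotbatch([1, 1, 1, 0, 1, 0], 0:1)   # 2×6 OneHotMatrix, as in the hunk

# smoothing with α = 0.2 over K = 2 classes
α = 0.2f0
y_smoothed = Flux.label_smoothing(y, α)

# check against the documented closed form y .* (1 - α) .+ α / K
@assert y_smoothed ≈ y .* (1 - α) .+ α / 2
```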
@@ -171,10 +171,10 @@ See also: [`logitcrossentropy`](@ref), [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref)
 # Example
 ```jldoctest
 julia> y_label = Flux.onehotbatch([0, 1, 2, 1, 0], 0:2)
-3×5 Flux.OneHotArray{3,2, Vector{UInt32}}:
- 1  0  0  0  1
- 0  1  0  1  0
- 0  0  1  0  0
+3×5 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
+ 1  ⋅  ⋅  ⋅  1
+ ⋅  1  ⋅  1  ⋅
+ ⋅  ⋅  1  ⋅  ⋅
 
 julia> y_model = softmax(reshape(-7:7, 3, 5) .* 1f0)
 3×5 Matrix{Float32}:
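As a local check of the example this hunk edits, the sketch below reuses the same inputs as the doctest (Flux ≥ 0.13 assumed) and confirms that `crossentropy` on the one-hot labels matches the textbook column-wise mean of `-sum(y .* log.(ŷ))`.

```julia
using Flux
using Flux: crossentropy, onehotbatch

y_label = onehotbatch([0, 1, 2, 1, 0], 0:2)      # 3×5 OneHotMatrix
y_model = softmax(reshape(-7:7, 3, 5) .* 1f0)    # each column sums to 1

loss = crossentropy(y_model, y_label)            # mean over the 5 columns

# the same value from the definition (recent Flux adds a tiny eps inside the log, hence ≈)
manual = -sum(y_label .* log.(y_model)) / size(y_label, 2)
@assert loss ≈ manual
```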
@@ -222,10 +222,10 @@ See also: [`binarycrossentropy`](@ref), [`logitbinarycrossentropy`](@ref), [`label_smoothing`](@ref)
 # Example
 ```jldoctest
 julia> y_label = Flux.onehotbatch(collect("abcabaa"), 'a':'c')
-3×7 Flux.OneHotArray{3,2, Vector{UInt32}}:
- 1  0  0  1  0  1  1
- 0  1  0  0  1  0  0
- 0  0  1  0  0  0  0
+3×7 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
+ 1  ⋅  ⋅  1  ⋅  1  1
+ ⋅  1  ⋅  ⋅  1  ⋅  ⋅
+ ⋅  ⋅  1  ⋅  ⋅  ⋅  ⋅
 
 julia> y_model = reshape(vcat(-9:0, 0:9, 7.5f0), 3, 7)
 3×7 Matrix{Float32}:
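For the logits-based variant touched here, a short sketch (again assuming current Flux) showing why the doctest feeds unnormalised scores: `logitcrossentropy` on the raw `y_model` agrees with `crossentropy` applied after `softmax`, but is the numerically safer fused form.

```julia
using Flux
using Flux: logitcrossentropy, crossentropy, onehotbatch

y_label = onehotbatch(collect("abcabaa"), 'a':'c')  # 3×7 OneHotMatrix
y_model = reshape(vcat(-9:0, 0:9, 7.5f0), 3, 7)     # raw scores (logits), 3×7 Float32

# fused softmax + crossentropy on logits vs. the two-step computation
@assert logitcrossentropy(y_model, y_label) ≈ crossentropy(softmax(y_model), y_label)
```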
@@ -280,9 +280,9 @@ julia> all(p -> 0 < p < 1, y_prob[2,:]) # else DomainError
 true
 
 julia> y_hot = Flux.onehotbatch(y_bin, 0:1)
-2×3 Flux.OneHotArray{2,2, Vector{UInt32}}:
- 0  1  0
- 1  0  1
+2×3 OneHotMatrix(::Vector{UInt32}) with eltype Bool:
+ ⋅  1  ⋅
+ 1  ⋅  1
 
 julia> Flux.crossentropy(y_prob, y_hot)
 0.43989f0
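This hunk shows only the tail of what appears to be the `binarycrossentropy` example; `y_prob` and `y_bin` are defined earlier in the docstring and are not reproduced in the diff. The self-contained sketch below uses made-up stand-ins for them (illustrative values, not the docstring's) to show the identity the doctest is exercising: `binarycrossentropy` on the positive-class row agrees with `crossentropy` against the one-hot encoding.

```julia
using Flux
using Flux: binarycrossentropy, crossentropy, onehotbatch

# hypothetical stand-ins for the y_bin / y_prob defined earlier in the docstring
y_bin  = [1, 0, 1]
y_prob = softmax(Float32[0.5 1.0 0.2; 1.5 0.3 1.1])  # row 1 ↦ P(label 0), row 2 ↦ P(label 1)

y_hot = onehotbatch(y_bin, 0:1)                      # 2×3 OneHotMatrix, as in the hunk

# treating row 2 as the probability of label 1, the two losses coincide
@assert binarycrossentropy(y_prob[2, :], y_bin) ≈ crossentropy(y_prob, y_hot)
```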