
Commit cdd4fc9

Merge #1750
1750: fix CI build r=CarloLucibello a=DhairyaLGandhi

I also found another case with the RNNs on CPU that ought to work (it is taken from the CUDA tests):

```julia
julia> m = LSTM(10,5);

julia> x = gpu(rand(10))
10-element Vector{Float64}:
 0.2904022727983888
 0.3478379218964027
 0.42768958025210235
 0.04257538661918936
 0.8775572608007804
 0.9921291141175299
 0.4817815828890992
 0.5651999360978797
 0.8793599163240291
 0.6348052530428492

julia> m(x)
ERROR: MethodError: no method matching (::Flux.LSTMCell{Matrix{Float32}, Vector{Float32}, Tuple{Matrix{Float32}, Matrix{Float32}}})(::Tuple{Matrix{Float32}, Matrix{Float32}}, ::Matrix{Float64})
Closest candidates are:
  (::Flux.LSTMCell{A, V, var"#s370"} where var"#s370"<:Tuple{AbstractMatrix{T}, AbstractMatrix{T}})(::Any, ::Union{AbstractVector{T}, AbstractMatrix{T}, Flux.OneHotArray}) where {A, V, T} at /Users/dhairyagandhi/Downloads/temp/bld/Flux.jl/src/layers/recurrent.jl:157
Stacktrace:
 [1] (::Flux.Recur{Flux.LSTMCell{Matrix{Float32}, Vector{Float32}, Tuple{Matrix{Float32}, Matrix{Float32}}}, Tuple{Matrix{Float32}, Matrix{Float32}}})(x::Matrix{Float64})
   @ Flux ~/Downloads/temp/bld/Flux.jl/src/layers/recurrent.jl:47
 [2] top-level scope
   @ REPL[13]:1
 [3] top-level scope
   @ ~/.julia/packages/CUDA/YpW0k/src/initialization.jl:52
```

Co-authored-by: Dhairya Gandhi <dhairya@juliacomputing.com>
2 parents 067f0b4 + 28aaabf commit cdd4fc9
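Not part of the commit itself, but a minimal sketch of how the MethodError reported above can be sidestepped while the CPU case remains broken: the failing candidate requires the input eltype to match the layer's Float32 parameters, so feeding a Float32 input lets the call dispatch (variable names here are illustrative).

```julia
using Flux

m = LSTM(10, 5)      # parameters and initial state default to Float32

x = rand(10)         # Vector{Float64}: reproduces the MethodError above
# m(x)               # errors, since dispatch requires a Float32 input

y = m(Float32.(x))   # matching the input eltype to the weights avoids the mismatch
```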

File tree: 1 file changed (+1 line, -1 line)


test/cuda/curnn.jl

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@ using Flux, CUDA, Test
   θ = gradient(() -> sum(m(x)), params(m))
   @test x isa CuArray
   @test θ[m.cell.Wi] isa CuArray
-  @test collect(m̄[].cell.Wi) == collect(θ[m.cell.Wi])
+  @test collect(m̄.cell.Wi) == collect(θ[m.cell.Wi])
 end
 
 @testset "RNN" begin
```
