Right now, the Flux and Lux layers are just tiny wrappers around the forward-pass implementations that live in GNNlib.jl. While it is nice to share the code, I think the price we pay in code readability is too high.
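For context, the pattern looks roughly like this (a minimal sketch with hypothetical names `shared_conv`, `MyFluxConv`, `MyLuxConv`, not the actual GNNlib.jl code): the framework-agnostic math lives in one function, and each framework's layer only holds or receives parameters and delegates to it.

```julia
# Hypothetical sketch of the current wrapper pattern, not actual GNNlib.jl code.

# Framework-agnostic forward pass, standing in for what lives in GNNlib.jl.
shared_conv(weight, x) = weight * x

# Flux side: the layer struct carries its own parameters and just delegates.
using Flux

struct MyFluxConv{W}
    weight::W
end
Flux.@layer MyFluxConv
(l::MyFluxConv)(x) = shared_conv(l.weight, x)

# Lux side: the layer is stateless; parameters are passed in explicitly,
# but the call method still only forwards to the shared implementation.
using Lux, Random

struct MyLuxConv <: Lux.AbstractLuxLayer
    in_dims::Int
    out_dims::Int
end

function Lux.initialparameters(rng::AbstractRNG, l::MyLuxConv)
    return (; weight = randn(rng, Float32, l.out_dims, l.in_dims))
end

(l::MyLuxConv)(x, ps, st) = shared_conv(ps.weight, x), st
```

In a duplicated setup, the body of `shared_conv` would instead be inlined into each layer's call method, so each package reads self-contained at the cost of keeping the two copies in sync.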
Maybe we should just have duplicated implementations in GraphNeuralNetworks.jl and GNNLux.jl.