Metal does not support floating-point atomic addition on shared (threadgroup, i.e. block-local) memory, but right now KernelAbstractions.jl lowers it to invalid instructions anyway, which prevents the kernel from running.
I'm happy to make a PR, but is there a rope I can follow in terms of adding a different lowering for this?