Now that Optimisers.jl is supported by FluxTraining.jl, FastAI.jl should move to using it. This means:

- default optimisers come from Optimisers.jl
- docs are updated
- parameter groups and layer freezing are updated to the new API (see the sketch below)
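
As a rough illustration of what the last point involves, here is a minimal sketch using the explicit-state Optimisers.jl API directly with a Flux model. This is not FastAI.jl's final integration, just the underlying calls (`setup`, `update!`, `freeze!`/`thaw!`, `adjust!`) that parameter groups and layer freezing would build on; the model and hyperparameters are made up for the example.

```julia
using Flux, Optimisers

# Toy model with two "parameter groups": a backbone and a head.
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))

# Default optimiser comes from Optimisers.jl; setup builds an explicit state tree.
state = Optimisers.setup(Optimisers.Adam(1e-3), model)

# Layer freezing: mark the backbone's state so its parameters are skipped by update!.
Optimisers.freeze!(state.layers[1])

# Parameter groups: give the head a different learning rate via adjust!.
Optimisers.adjust!(state.layers[2]; eta = 1e-2)

# One training step with the explicit-state API.
x, y = rand(Float32, 4, 16), rand(Float32, 2, 16)
grads = Flux.gradient(m -> Flux.Losses.mse(m(x), y), model)[1]
state, model = Optimisers.update!(state, model, grads)

# The backbone can be unfrozen later with Optimisers.thaw!(state.layers[1]).
```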